<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/">
<channel>
	<title>Comments on: Kingston Ultimate GT 2TB Flash Drive Review &#8211; All that Space!</title>
	<atom:link href="https://www.thessdreview.com/hardware/flash-drives/kingston-ultimate-gt-2tb-flash-drive-review-space/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.thessdreview.com/hardware/flash-drives/kingston-ultimate-gt-2tb-flash-drive-review-space/</link>
	<description>The World&#039;s Dedicated SSD Education and Review Resource</description>
	<lastBuildDate>Wed, 12 Apr 2017 03:58:00 +0000</lastBuildDate>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	
	<item>
		<title>By: Greg Zeng</title>
		<link>https://www.thessdreview.com/hardware/flash-drives/kingston-ultimate-gt-2tb-flash-drive-review-space/#comment-23970</link>

		<dc:creator><![CDATA[Greg Zeng]]></dc:creator>
		<pubDate>Wed, 12 Apr 2017 03:58:00 +0000</pubDate>
		<guid isPermaLink="false">https://www.thessdreview.com/?p=95497#comment-23970</guid>

					<description><![CDATA[Running these flash drives on my (old, 2013) Dell XPS-15 notebook gives variable results.  Each of the notebook&#039;s three USB 3.0 ports gives consistently different benchmark results, probably because the hardware is so worn out after four years.

CrystalDiskMark is the quickest and easiest to run.  Because the app is &quot;upgraded&quot; a few times yearly, I generally use the latest version, hoping that the results do not vary.  I always test on the same (and fastest) USB 3.0 port, with no other CPU load while benching.

Even then I find CrystalDiskMark varies a few percent on each run.  Single run or multiple runs, it varies.  The factory default is the average of five runs, and even that varies from one five-run session to the next.

The big advantage of CrystalDiskMark is that it saves results in a database-ready file (cut &#038; paste from the .htm file), as well as a GUI image summary of the reports.  Flash drives seem to report similar results independent of the test file size, as far as I know, so my quick tests are five runs at the smallest file size.

Other I/O benchmark tools seem too fussy, in my opinion.]]></description>
			<content:encoded><![CDATA[<p>Running these flash drives on my (old, 2013) Dell XPS-15 notebook gives variable results.  Each of the notebook&#8217;s three USB 3.0 ports gives consistently different benchmark results, probably because the hardware is so worn out after four years.</p>
<p>CrystalDiskMark is the quickest and easiest to run.  Because the app is &#8220;upgraded&#8221; a few times yearly, I generally use the latest version, hoping that the results do not vary.  I always test on the same (and fastest) USB 3.0 port, with no other CPU load while benching.</p>
<p>Even then I find CrystalDiskMark varies a few percent on each run.  Single run or multiple runs, it varies.  The factory default is the average of five runs, and even that varies from one five-run session to the next.</p>
<p>The big advantage of CrystalDiskMark is that it saves results in a database-ready file (cut &amp; paste from the .htm file), as well as a GUI image summary of the reports.  Flash drives seem to report similar results independent of the test file size, as far as I know, so my quick tests are five runs at the smallest file size.</p>
<p>Other I/O benchmark tools seem too fussy, in my opinion.</p>
]]></content:encoded>
		
			</item>
	</channel>
</rss>
