<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	
	>
<channel>
	<title>
	Comments on: Array resizing performance	</title>
	<atom:link href="https://undocumentedmatlab.com/articles/array-resizing-performance/feed" rel="self" type="application/rss+xml" />
	<link>https://undocumentedmatlab.com/articles/array-resizing-performance?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=array-resizing-performance</link>
	<description>Professional Matlab consulting, development and training</description>
	<lastBuildDate>Thu, 11 Jul 2013 22:30:14 +0000</lastBuildDate>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.7.2</generator>
	<item>
		<title>
		By: Yair Altman		</title>
		<link>https://undocumentedmatlab.com/articles/array-resizing-performance#comment-223829</link>

		<dc:creator><![CDATA[Yair Altman]]></dc:creator>
		<pubDate>Thu, 11 Jul 2013 22:30:14 +0000</pubDate>
		<guid isPermaLink="false">http://undocumentedmatlab.com/?p=2949#comment-223829</guid>

					<description><![CDATA[In reply to &lt;a href=&quot;https://undocumentedmatlab.com/articles/array-resizing-performance#comment-223824&quot;&gt;Ricky&lt;/a&gt;.

@Ricky - 11K x 4090 x 8 = ~343 MB (not 200 MB as you said). And this memory needs to be contiguous, since it&#039;s a regular numeric matrix. Even if you have this much memory available, it is quite possible that the largest &lt;u&gt;contiguous&lt;/u&gt; block of memory is smaller than 343 MB. If you need to do this for 2 other arrays of the same size, you only make things worse...]]></description>
			<content:encoded><![CDATA[<p>In reply to <a href="https://undocumentedmatlab.com/articles/array-resizing-performance#comment-223824">Ricky</a>.</p>
<p>@Ricky &#8211; 11K x 4090 x 8 = ~343 MB (not 200 MB as you said). And this memory needs to be contiguous, since it&#8217;s a regular numeric matrix. Even if you have this much memory available, it is quite possible that the largest <u>contiguous</u> block of memory is smaller than 343 MB. If you need to do this for 2 other arrays of the same size, you only make things worse&#8230;</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: Ricky		</title>
		<link>https://undocumentedmatlab.com/articles/array-resizing-performance#comment-223824</link>

		<dc:creator><![CDATA[Ricky]]></dc:creator>
		<pubDate>Thu, 11 Jul 2013 22:16:11 +0000</pubDate>
		<guid isPermaLink="false">http://undocumentedmatlab.com/?p=2949#comment-223824</guid>

					<description><![CDATA[Hello Yair, 

I was reading through your posts about memory management and pre-allocation of arrays and found them very interesting. Currently I am stuck in a very peculiar situation. My input data file is a .mat file of size 52 MB. I load the file in MATLAB using the &lt;i&gt;&lt;b&gt;load&lt;/b&gt;&lt;/i&gt; command. I scan through the file and extract my data points out of it. I get 11000 rows and each row has 4090 columns. I preallocate a large array using the &lt;i&gt;&lt;b&gt;zeros&lt;/b&gt;&lt;/i&gt; command. I know my data type is double and it uses 8 bytes. The total bytes used for the above array is 200 MB. For further processing of my data I have to allocate 2 more arrays with the same rows and columns. 

So while doing this I run out of memory. How can I free up the previously allocated memory, and how can I solve this problem?]]></description>
			<content:encoded><![CDATA[<p>Hello Yair, </p>
<p>I was reading through your posts about memory management and pre-allocation of arrays and found them very interesting. Currently I am stuck in a very peculiar situation. My input data file is a .mat file of size 52 MB. I load the file in MATLAB using the <i><b>load</b></i> command. I scan through the file and extract my data points out of it. I get 11000 rows and each row has 4090 columns. I preallocate a large array using the <i><b>zeros</b></i> command. I know my data type is double and it uses 8 bytes. The total bytes used for the above array is 200 MB. For further processing of my data I have to allocate 2 more arrays with the same rows and columns. </p>
<p>So while doing this I run out of memory. How can I free up the previously allocated memory, and how can I solve this problem?</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: Norman		</title>
		<link>https://undocumentedmatlab.com/articles/array-resizing-performance#comment-88155</link>

		<dc:creator><![CDATA[Norman]]></dc:creator>
		<pubDate>Wed, 30 May 2012 09:33:18 +0000</pubDate>
		<guid isPermaLink="false">http://undocumentedmatlab.com/?p=2949#comment-88155</guid>

<description><![CDATA[Is anyone offering a benchmark suite to check the different gains of all the mentioned methods? As explained, the actual benefit depends on the Matlab version, and I have to use Matlab 2007b ... 
I tried some on my own and it seems there are differences from the published results.

(I have to use 2007b because the symbolic toolbox changed and I only get errors now with newer versions :-( ).]]></description>
<content:encoded><![CDATA[<p>Is anyone offering a benchmark suite to check the different gains of all the mentioned methods? As explained, the actual benefit depends on the Matlab version, and I have to use Matlab 2007b &#8230;<br />
I tried some on my own and it seems there are differences from the published results.</p>
<p>(I have to use 2007b because the symbolic toolbox changed and I only get errors now with newer versions 🙁 ).</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: naor		</title>
		<link>https://undocumentedmatlab.com/articles/array-resizing-performance#comment-87479</link>

		<dc:creator><![CDATA[naor]]></dc:creator>
		<pubDate>Thu, 24 May 2012 18:47:13 +0000</pubDate>
		<guid isPermaLink="false">http://undocumentedmatlab.com/?p=2949#comment-87479</guid>

					<description><![CDATA[In reply to &lt;a href=&quot;https://undocumentedmatlab.com/articles/array-resizing-performance#comment-87377&quot;&gt;naor&lt;/a&gt;.

@Yair, thanks for the link.]]></description>
			<content:encoded><![CDATA[<p>In reply to <a href="https://undocumentedmatlab.com/articles/array-resizing-performance#comment-87377">naor</a>.</p>
<p>@Yair, thanks for the link.</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: Daniel		</title>
		<link>https://undocumentedmatlab.com/articles/array-resizing-performance#comment-87443</link>

		<dc:creator><![CDATA[Daniel]]></dc:creator>
		<pubDate>Thu, 24 May 2012 11:48:59 +0000</pubDate>
		<guid isPermaLink="false">http://undocumentedmatlab.com/?p=2949#comment-87443</guid>

					<description><![CDATA[I think there is a small but important difference between what employees of TMW advise and what TMW advises. In general the employees of TMW are clear when they are and are not speaking for TMW. I wish TMW would make more &quot;statements&quot; since employees are obviously very conservative about not revealing IP.]]></description>
			<content:encoded><![CDATA[<p>I think there is a small but important difference between what employees of TMW advise and what TMW advises. In general the employees of TMW are clear when they are and are not speaking for TMW. I wish TMW would make more &#8220;statements&#8221; since employees are obviously very conservative about not revealing IP.</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: Yair Altman		</title>
		<link>https://undocumentedmatlab.com/articles/array-resizing-performance#comment-87384</link>

		<dc:creator><![CDATA[Yair Altman]]></dc:creator>
		<pubDate>Thu, 24 May 2012 00:30:24 +0000</pubDate>
		<guid isPermaLink="false">http://undocumentedmatlab.com/?p=2949#comment-87384</guid>

					<description><![CDATA[In reply to &lt;a href=&quot;https://undocumentedmatlab.com/articles/array-resizing-performance#comment-87377&quot;&gt;naor&lt;/a&gt;.

@Naor - sorry, I don&#039;t know for sure. I assume that at least part of the warm-up time is indeed related to the data size, because the data needs to be allocated in memory and then loaded onto the CPU cache. Once it&#039;s in the cache, data access times will be much faster. 

Other warm-up factors include function compilation time (which is more-or-less constant), disk caching for the relevant m-files, and Matlab engine being swapped-in from virtual memory and other similar aspects that are dynamic in nature. If there&#039;s any I/O involved in the function, then I/O caching may also come into play here. 

Former MathWorker Bill McKeeman says in his &lt;a target=&quot;_blank&quot; href=&quot;http://www.mathworks.com/matlabcentral/fileexchange/18510-matlab-performance-measurement&quot; rel=&quot;nofollow&quot;&gt;Performance Tuning document&lt;/a&gt; that he usually throws the initial 3 measurements away when profiling - read his document, it&#039;s illuminating.]]></description>
			<content:encoded><![CDATA[<p>In reply to <a href="https://undocumentedmatlab.com/articles/array-resizing-performance#comment-87377">naor</a>.</p>
<p>@Naor &#8211; sorry, I don&#8217;t know for sure. I assume that at least part of the warm-up time is indeed related to the data size, because the data needs to be allocated in memory and then loaded onto the CPU cache. Once it&#8217;s in the cache, data access times will be much faster. </p>
<p>Other warm-up factors include function compilation time (which is more-or-less constant), disk caching for the relevant m-files, and Matlab engine being swapped-in from virtual memory and other similar aspects that are dynamic in nature. If there&#8217;s any I/O involved in the function, then I/O caching may also come into play here. </p>
<p>Former MathWorker Bill McKeeman says in his <a target="_blank" href="http://www.mathworks.com/matlabcentral/fileexchange/18510-matlab-performance-measurement" rel="nofollow">Performance Tuning document</a> that he usually throws the initial 3 measurements away when profiling &#8211; read his document, it&#8217;s illuminating.</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: naor		</title>
		<link>https://undocumentedmatlab.com/articles/array-resizing-performance#comment-87377</link>

		<dc:creator><![CDATA[naor]]></dc:creator>
		<pubDate>Wed, 23 May 2012 23:43:00 +0000</pubDate>
		<guid isPermaLink="false">http://undocumentedmatlab.com/?p=2949#comment-87377</guid>

					<description><![CDATA[Very interesting post! I wonder though how I will ever remember those issues that only arise so infrequently. I bet by the next time I need to grow an array dynamically I will have forgotten all about this. But maybe the Mathworks JIT team will remember.

The JIT discussion reminds me of a very important question I have had for many years regarding Matlab&#039;s function &quot;warm-up&quot; phenomenon. Everyone knows, I&#039;m sure, that the first time you run a function in Matlab it takes a lot longer than the next run through. There are some obvious reasons for this and quite likely other not obvious reasons. But we all know to &quot;warm up&quot; a function before benchmarking it. The question I have not been able to answer is do we need to worry about warm-up before our production runs on large datasets too. In other words, is the first-run slow-down simply constant overhead or does it grow with problem size in some way? If so, what constitutes a &quot;first&quot; run? First time in a Matlab session? After a &lt;code&gt; clear all &lt;/code&gt; command? Something else?

I guess it&#039;s not very directly related to this post, but I thought Yair might know something about this.

Thanks,
-n]]></description>
			<content:encoded><![CDATA[<p>Very interesting post! I wonder though how I will ever remember those issues that only arise so infrequently. I bet by the next time I need to grow an array dynamically I will have forgotten all about this. But maybe the Mathworks JIT team will remember.</p>
<p>The JIT discussion reminds me of a very important question I have had for many years regarding Matlab&#8217;s function &#8220;warm-up&#8221; phenomenon. Everyone knows, I&#8217;m sure, that the first time you run a function in Matlab it takes a lot longer than the next run through. There are some obvious reasons for this and quite likely other not obvious reasons. But we all know to &#8220;warm up&#8221; a function before benchmarking it. The question I have not been able to answer is do we need to worry about warm-up before our production runs on large datasets too. In other words, is the first-run slow-down simply constant overhead or does it grow with problem size in some way? If so, what constitutes a &#8220;first&#8221; run? First time in a Matlab session? After a <code> clear all </code> command? Something else?</p>
<p>I guess it&#8217;s not very directly related to this post, but I thought Yair might know something about this.</p>
<p>Thanks,<br />
-n</p>
]]></content:encoded>
		
			</item>
	</channel>
</rss>
