Friday, May 22, 2009

ReSharper 4.5 Memory Consumption

Just tried ReSharper on a fairly large solution with 38 projects (13 C# and 25 C++). With the ReSharper add-in enabled, memory consumption sits at about 1.5 GByte for the solution; with the add-in disabled, it sits at about 300 MByte. Over 90% of the solution's source code is C++ (a mix of managed and unmanaged), with only a small portion in C#.

JetBrains claim they have worked on the memory consumption of ReSharper 4.5. Looks to me as if more work is needed...

On my 32-bit box I have to reboot once in a while because I run out of memory when the add-in is enabled... (Vista 32-bit, VS 2008, 4 GB RAM)



Update on 04 June 09: It looks as if the memory consumption goes up to the same value even with R# disabled. This happens when IntelliSense creates or updates its database. However, with R# disabled the memory drops back to normal once IntelliSense has finished that activity, whereas with R# enabled IntelliSense doesn't appear to free the memory again. So the problem seems to be caused by the combination of the two. A solution with just C# projects (and hence without IntelliSense for C++) doesn't show the issue.
Since I am in contact with JetBrains at the moment, let's see what they can find based on the info I can provide.


Update on 02 February 2010: I have had some conversations with JetBrains, and they have asked me to provide an example that shows the behavior. The challenge is that it seems to happen (or become apparent) only when the solution contains a large number of C++ projects and the overall C++ code base is substantial. While I do have an example that demonstrates the behavior, I can hardly hand it over: would you send your entire code base? At the moment some members of my team have switched off ReSharper when working with the solution that also contains C++ projects.


Also check out my blog on Agile Leadership.

Monday, May 18, 2009

A comment on McAfee

Right now McAfee is running a full system scan on my computer. Not only does it consume a large amount of system resources - sometimes to the point where the system is unusable because it is waiting for some file operation to complete - it also has the following interesting 'feature'. Let's assume a full system scan is running. That makes sense, since there are many different ways viruses can find their way onto your computer. So that system scan is running, and you know it will take an hour or more to complete. You have downloaded a file that you'd like to scan for viruses before you use it. Can you do that? Nope, not with the McAfee version I am currently using. It displays a dialog box telling me another scan is running and that I have to cancel it before I can start a new one. My options are:
  1. Wait until McAfee has finished scanning my system.
  2. Cancel the system scan so I can scan the downloaded file.
Option one makes me less productive. Option two creates a potential security issue. Neither option meets my requirements. How about being able to explicitly scan files, e.g. from the context menu in the file explorer, regardless of whether other scans are running? I'll check whether other virus scanners work the same way or behave differently. (Before someone asks: automatic updates to keep McAfee up to date are enabled.)

Friday, May 15, 2009

I couldn't resist... - a quote from Subversion's web site

When assessing the memory consumption of the Subversion (SVN) 1.6.1 client, I also found the following on SVN's web site:
"all the memory we allocate will be cleaned up eventually" (Source: "Hacker's Guide To Subversion", retrieved 15 May 2009)
I like that quote because I think it is true for all software! "Eventually" all memory will be cleaned up - if necessary, when the process terminates. On second thought, though, here might be an opportunity for a new business model! What if we had one-way memory? You can allocate it once, and once a process has consumed it you need to buy new DRAM (or maybe it could be called OW-DRAM, as in "One Way" DRAM). I'm sure Intel and the other chip vendors would love it!

But seriously (and I'm sure I've got some funny things somewhere in my published text and code as well. Tell me!): a memory leak is a memory leak is a memory leak. Using add and commit for adding large numbers of files consumes about 1 KByte per file. In one case I tried to add about 13,000 files, and the process fell over when it reached 1.3 GByte, having added only 10% of the files. So this approach doesn't work for version 1.6.1. All indications are that this is a memory leak: it is proportional to the number of files you add and try to commit, OS tools show how the process grows in size, and it never shrinks (unless the process terminates one way or the other). Admittedly I didn't use a memory profiler, but what do you think the issue is when the command line reports "out of memory"?

The better option for getting large sets of files into a repository - and that's what I have learned by now - is to use SVN's import functionality. I have run several tests with up to 65,000 files in one batch (several hundred MB), and the memory consumption of the client process grew only very slowly, from about 10 MB to 58 MB at most. This growth, I suspect, is probably related to memory fragmentation, but it is definitely within acceptable limits. So the recommendation is: don't use svn add followed by svn commit; instead use svn import if you have large sets of files to bring into the repository. If they end up in the wrong place you can always move them later using a repository browser such as the one that comes with TortoiseSVN.
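To make that recommendation concrete, here is roughly what the two approaches look like on the command line. The local path, repository URL, and commit messages below are placeholders for illustration, not the actual project I was working with:

    # Placeholder paths, URL, and messages.

    # The approach that ran out of memory for me (add + commit of many new files):
    svn add large-tree
    svn commit large-tree -m "Add large tree"

    # The approach that worked (import straight into the repository):
    svn import large-tree http://svn.example.com/repos/project/trunk/large-tree -m "Import large tree"

    # Afterwards, check out (or update) a working copy if you want the files under version control locally:
    svn checkout http://svn.example.com/repos/project/trunk/large-tree

Keep in mind that svn import does not turn the imported directory into a working copy; you need a checkout or update afterwards if you want to keep working on those files from the same location.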

Saturday, May 09, 2009

Handling of middle mouse click in Firefox 3.0

Just had a "experience" of the undesired kind. Normally when you use the middle mouse button on a link in Firefox, it opens the page in a new tab. Well, when you use the middle mouse button on the tab (even when not hitting the little close button) then it closes the tab! Not quite what I expected. I thought it would take the link of what is displayed in that tab and open a new tab with that link. I just lost quite a lot of work that I just had entered into a web form. :-( Oh, well...

Friday, May 08, 2009

SVN Client: Out Of Memory

With the SVN 1.6.1 client (32-bit on Windows) I ran into an issue today. I tried to commit 13,000 files totaling about 100 MBytes. Most of them are text files; only a very small number are binary. No matter what, this commit didn't work. I actually had to take small portions of it and commit one portion at a time. Admittedly this is not a typical change set, so I don't want to complain too much.

There is one observation, though, that made me think: when the client starts to send the content (file data), the memory consumption goes up. I'm not sure why, because the file is still available locally (we are uploading it, after all) and the client sends one file at a time anyway. If a buffer is used to get the data into the appropriate format for the wire transmission, I can understand that. But can't that buffer be reused once one file has been transmitted and the next one is started? Maybe I'm overlooking something about the inner workings of Subversion; in that case please comment on this post. Otherwise I have a gut feeling that there might be a memory leak in the current implementation (SVN client 1.6.1).

An update: There are a few open bugs related to memory leaks, including one regarding committing a large number of added files. That one is from 2004 and still open.
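For completeness, this is roughly what the "commit one portion at a time" workaround looked like. The directory names below are made up, and the exact split obviously depends on your tree; also note that a newly added parent directory has to be committed before (or together with) its children. I believe svn commit accepts --depth for this, but committing the parent on its own first achieves the same thing either way:

    # Placeholder paths; a sketch of committing a large set of added files in portions.
    svn add big-drop
    svn commit --depth empty big-drop -m "Add top-level directory only"
    svn commit big-drop/part-01 -m "First portion of the new files"
    svn commit big-drop/part-02 -m "Second portion of the new files"
    # ...and so on, keeping each commit small enough for the client to handle.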