Visual Studio: Why is there no 64 bit version? (yet)

Rico Mariani
Apr 8, 2021

UPDATE! Visual Studio goes 64 bit in VS2022!!

This was one of my most contentious postings ever. I dug up an archived version because the original was deleted from MSDN. Many of my postings are archived there (e.g. Development Tools Ecosystem Summit | Microsoft Docs), but it seems some of my articles were censored. I’m kind of sad about that; this is an important article that reflects real thinking at the time.

So keeping in mind this is quite old… here it is for reference in its entirety. I recovered it from Wayback Machine (archive.org) at Visual Studio: Why is there no 64 bit version? (yet) — Rico Mariani’s Performance Tidbits — Site Home — MSDN Blogs (archive.org).

Interestingly, I was often asked “if not now then when?” and my standard response was “probably not within five years”. And I was sure I included that in this article but I guess I chickened out of being specific. I might have put it in the comments but they are lost. So, that was 2009… it’s now 2021 :D

Edit after re-read: where I wrote “my feeling is that the best place to run VS for this generation” — total weasel-words from me there. That’s definitely me not wanting to say 5 years in this article. By “this generation” I meant “the life of the shell that hosts the extensions”. I had hoped we would do a 64 bit shell soonish but I don’t think that came to pass.

I wrote a follow-up years later but frankly it wasn’t nearly as good and I wrote it on a bad hair day so I’m not sad that I couldn’t find it.

ricom

10 Jun 2009 11:34 PM

Disclaimer: This is yet another of my trademarked “approximately correct” discussions

From time to time customers or partners ask me about our plans to create a 64 bit version of Visual Studio. When is it coming? Why aren’t we making it a priority? Haven’t we noticed that 64 bit PCs are very popular? Things like that. We just had an internal discussion about “the 64 bit issue” and so I thought I would elaborate a bit on that discussion for the blog-o-sphere.

So why not 64 bit right away?

Well, there are several concerns with such an endeavor.

First, from a performance perspective the pointers get larger, so data structures get larger, and the processor cache stays the same size. That basically results in a raw speed hit (your mileage may vary). So you start in a hole and you have to dig yourself out of that hole by using the extra memory above 4G to your advantage. In Visual Studio this can happen in some large solutions but I think a preferable thing to do is to just use less memory in the first place. Many of VS’s algorithms are amenable to this. Here’s an old article that discusses the performance issues at some length: http://blogs.msdn.com/joshwil/archive/2006/07/18/670090.aspx
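To make that concrete, here’s a minimal sketch (the Node struct is invented for illustration, not anything from the Visual Studio code base) of how a pointer-heavy structure grows when the pointers double in size while cache lines stay 64 bytes:

```cpp
#include <cstdio>

// A hypothetical symbol-table node, heavy on pointers the way many
// IDE data structures tend to be.
struct Node {
    Node*       parent;
    Node*       firstChild;
    Node*       nextSibling;
    const char* name;
    int         flags;
};

int main() {
    // 32 bit build: 4 pointers * 4 bytes + 4 bytes = 20 bytes.
    // 64 bit build: 4 pointers * 8 bytes + 4 bytes = 36, padded to 40 bytes.
    // The same 32 KB L1 data cache now holds roughly half as many nodes,
    // even though the program is doing exactly the same work.
    printf("sizeof(Node) = %zu bytes\n", sizeof(Node));
    return 0;
}
```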

Secondly, from a cost perspective, probably the shortest path to porting Visual Studio to 64 bit is to port most of it to managed code incrementally and then port the rest. The cost of a full port of that much native code is going to be quite high and of course all known extensions would break and we’d basically have to create a 64 bit ecosystem pretty much like you do for drivers. Ouch.

[Clarification 6/11/09: The issue is this: If all you wanted to do was move the code to 64 bit then yes the shortest path is to do a direct port. But that’s never the case. In practice porting has an opportunity cost; it competes with other desires. So what happens is more like this: you get teams that have C++ code written for 32 bits and they say “I want to write feature X, if I port to managed I can do feature X plus other things more easily, that seems like a good investment” so they go to managed code for other reasons. But now they also have a path to 64 bit. What’s happening in practice is that more and more of Visual Studio is becoming managed for reasons unrelated to bitness. Hence a sort of net-hybrid porting strategy over time.]

So, all things considered, my feeling is that the best place to run VS for this generation is in the 32 bit emulation mode of a 64 bit operating system; this doubles your available address space without taking the data-space hit and it gives you extra benefits associated with that 64 bit OS. More on those benefits later.

Having said that, I know there are customers that would benefit from a 64 bit version but I actually think that amount of effort would be better spent in reducing the memory footprint of the IDE’s existing structures rather than doing a port. There are many tradeoffs here and the opportunity cost of the port is high.

Is it expensive because the code is old and of poor quality?

It’s not so much the quality of the code — a lot of it is only a few releases old — as it is the amount of code involved. Visual Studio is huge and most of its packages wouldn’t benefit from 64 bit addressing, but nearly all of it would benefit from using more lazy algorithms. The tendency to load too much about the current solution is a general problem which results in slowness even when there is enough memory to do the necessary work. Adding more memory to facilitate doing even more work that we shouldn’t be doing in the first place tends to incent the wrong behavior. I want to load less, not more.
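As a rough sketch of what “load less, not more” means in practice (ProjectFile and SymbolTable are invented names for the example, not actual Visual Studio internals), the difference is between eagerly building data for every file in the solution at load time and deferring that work until someone actually asks:

```cpp
#include <memory>
#include <string>

// Hypothetical per-file symbol data that is expensive to build.
struct SymbolTable { /* ... */ };

class ProjectFile {
public:
    explicit ProjectFile(std::string path) : path_(std::move(path)) {}

    // Lazy: the symbol table is built the first time it is needed,
    // so files nobody touches never cost memory or CPU.
    const SymbolTable& Symbols() {
        if (!symbols_) {
            symbols_ = Parse(path_);   // the expensive work happens on demand
        }
        return *symbols_;
    }

private:
    static std::unique_ptr<SymbolTable> Parse(const std::string& /*path*/) {
        return std::make_unique<SymbolTable>();   // real code would read and parse the file
    }

    std::string path_;
    std::unique_ptr<SymbolTable> symbols_;   // stays empty until first use
};
```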

Doesn’t being a 64 bit application save you all kinds of page faults and so forth?

A 64 bit address space for the process isn’t going to help you with page faults except in maybe indirect ways, and it will definitely hurt you in direct ways because your data is bigger. In contrast a 64 bit operating system could help you a lot! If you’re running as a 32 bit app on a 64 bit OS then you get all of the 4G address space and all of that could be backed by physical memory (if you have the RAM) even without you using 64 bit pointers yourself. You’ll see potentially huge improvements related to the size of the disk cache (not in your address space) and the fact that your working set won’t need to be eroded in favor of other processes as much. Transient components and data (like C++ compilers and their big .pch files) stay cached in physical memory, but not in your address space. 32 bit processes accrue all these benefits just as surely as 64 bit ones.
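You can see this for yourself with a minimal Win32 sketch (assuming a 32 bit build linked with /LARGEADDRESSAWARE, which is what gets a 32 bit process the full 4G of address space on a 64 bit OS) that asks Windows how much virtual address space and physical memory the process can see:

```cpp
#include <windows.h>
#include <cstdio>

int main() {
    MEMORYSTATUSEX ms = {};
    ms.dwLength = sizeof(ms);
    GlobalMemoryStatusEx(&ms);

    // A 32 bit, large-address-aware process on a 64 bit OS reports close to
    // 4 GB of virtual address space; the same binary on a 32 bit OS tops out
    // around 2 GB (or 3 GB with special boot options).
    printf("virtual address space: %llu MB\n", ms.ullTotalVirtual / (1024 * 1024));

    // Physical memory beyond what any one process maps isn't wasted: the OS
    // uses it for the file cache, which is where the big wins come from.
    printf("physical memory:       %llu MB\n", ms.ullTotalPhys / (1024 * 1024));
    return 0;
}
```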

In fact, the only direct benefit you get from having more address space for your process is that you can allocate more total memory, but if we’re talking about scenarios that already fit in 4G then making the pointers bigger could cause them to not fit and certainly will make them take more memory, never less. If you don’t have abundant memory that growth might make you page, and even if you do have the memory it will certainly make you miss the cache more often. Remember, the cache size does not grow in 64 bit mode but your data structures do. Where you might get savings is if the bigger address space allowed you to have less fragmentation and more sharing. But Vista+ auto-relocates images efficiently anyway for other reasons, so this is less of a win. You might also get benefits if the 64 bit instruction set is especially good for your application (e.g. if you do a ton of 64 bit math).

So, the only way you’re going to see serious benefits is if you have scenarios that simply will not fit into 4G at all. But, in Visual Studio anyway, when we don’t fit into 4G of memory I have never once found myself thinking “wow, System X needs more address space”; I always think “wow, System X needs to go on a diet.”

Your mileage may vary and you can of course imagine certain VS packages (such as a hypothetical data analytics debugging system) that might require staggering amounts of memory, but those should be handled as special cases. And it is possible for us to do a hybrid plan that includes some 64 bit slave processes.
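For what that hybrid could look like mechanically, here’s a minimal sketch; AnalyticsWorker64.exe is an invented name, not a real component. The 32 bit host pushes the memory-hungry work into a separate 64 bit process and keeps only the results in its own address space:

```cpp
#include <windows.h>
#include <cstdio>

int main() {
    // Command line for a hypothetical 64 bit worker process.
    // CreateProcessW requires a writable command-line buffer.
    wchar_t cmd[] = L"AnalyticsWorker64.exe --input huge.dat --out results.tmp";

    STARTUPINFOW si = { sizeof(si) };
    PROCESS_INFORMATION pi = {};

    if (!CreateProcessW(nullptr, cmd, nullptr, nullptr, FALSE, 0,
                        nullptr, nullptr, &si, &pi)) {
        printf("failed to start worker (error %lu)\n", GetLastError());
        return 1;
    }

    // Real code would not block the UI thread like this.
    WaitForSingleObject(pi.hProcess, INFINITE);
    CloseHandle(pi.hThread);
    CloseHandle(pi.hProcess);

    // ...read results.tmp here; only the summary lives in the 32 bit host.
    return 0;
}
```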

I do think we might seem less cool because we’re 32 bit only, but I think the right way to fight that battle is with good information and a great product.

Then why did Office make the decision to go 64 bit?

This section is entirely recreational speculation because I didn’t ask them (though frankly I should). But I think I can guess why. Maybe a kind reader can tell me how wrong I am :)

First, some of the hardest porting issues aren’t about getting the code to run properly but are about making sure that the file formats the new code generates remain compatible with previous (and future) versions of those formats. Remember, the ported code now thinks it has 64 bit offsets in some data structures. That compatibility could be expensive to achieve because these things find their way into subtle places — potentially any binary file format could have pointer-size issues. However, Office already did a pass on all its file formats to standardize them on compressed XML, so they cannot possibly have embedded pointers anymore. That’s a nice cost saver on the road to 64 bit products.

Secondly, on the benefit side, there are customers out there that would love to load enormous datasets into Excel or Access and process them interactively. Now in Visual Studio I can look you in the face and say “even if your solution has more than 4G of files I shouldn’t have to load it all for you to build and refactor it” but that’s a much harder argument to make for say Excel.

In Visual Studio, if you needed to do a new feature like debugging of a giant analytics system that used a lot of memory, I would say “make that analytics debugging package 64 bit, the rest can stay the way they are”, but porting, say, half of Excel to 64 bits isn’t exactly practical.

So the Office folks have different motivations and costs and therefore came to different conclusions — the above are just my personal uninformed guesses as to why that might be the case.

One thing is for sure though: I definitely think that the benefits of the 64 bit operating system are huge for everyone. Even if it was nothing more than using all that extra memory as a giant disk cache, just that can be fabulous, and you get a lot more than that!


Rico Mariani

I’m an Architect at Microsoft; I specialize in software performance engineering and programming tools.