"Thirty-two bits equates to 232 memory addresses"
Erm, I think you missed formatting the 32 as superscript, or missed out the ^ symbol :-) Otherwise 232 memory addresses seems quite low!
There's no question: 64-bit computing is here to stay - and it seems set to be the future of computing. But is it an essential element of your next desktop refresh cycle? Chances are it will be. The struggle to reach a stable 32-bit platform may have escaped you but it took many years before we were finally rid of slow 16- …
"Thirty-two bits equates to 232 memory addresses"
Erm, I think you missed formatting the 32 as superscript, or missed out the ^ symbol :-) Otherwise 232 memory addresses seems quite low!
Unless you're running an MK14, in which case it should just about do.
Unless you want to run the moon landing game - I think that took almost all ( maybe 248 bytes ?)
We've had 64-bit processors in the home for the best part of a decade. The only thing holding us back was the prevalence of systems being shipped with 32-bit Windows. If we'd had the 64-bit push with Windows Vista then we'd all be happily chugging along in 64-bit already. After all, it's not like driver support was great to begin with on Vista.
Thank goodness most modern systems are shipping with 64-bit Windows 7.
Our company recently took delivery of a handful of Dell laptops which have in them 64-bit processors and 4GB of RAM. They came pre-installed with Windows 7 Home Premium 32-bit.
Am I missing something, or are they completely mis-selling these machines?
No further insult is needed for those who selected the equipment and those who supervised the purchase.
If they're like most of the Dell and Lenovo systems floating around here, they might have 64-bit CPUs, but they cap out at 4GB RAM. If you're not using over 4GB, there's very little point to a 64-bit OS. (In some computationally intensive tasks it can make a difference even with limited RAM, but I can probably count the scenarios for doing that on a laptop on one hand.)
You should mention that your article only applies to Windows. Linux doesn't suffer the same limitations. 32-bit on linux does have limitations, but they are far less limiting than the ones imposed by MS.
I neither know nor care what the situation is for Mac...
This limitation is specific to Intel and compatible 32-bit architectures, irrespective of the OS. Compared to Windows, Linux treats the matter somewhat differently, but it definitely has the same limitation of a 4GB memory space per process. If a Mac runs on 32-bit Intel then it is subject to the same treatment.
I know this very well because I tried hard to explain to a thick upper-management dude why this 4GB limit exists even though his server told him it had 16GB of RAM installed.
32-bit Macs have a maximum limit of 32GB of addressable RAM because OS X uses Physical Address Extension (PAE). Apple had been selling 32-bit OSes on systems with more than 4GB of physical RAM for quite some time before it introduced a 64-bit OS. All thanks to PAE.
Actually, you should have read a bit more. 64 bit capable hardware has been the _default_, not the exception, for _at_least_ 4 years for Intel and AMD based platforms. I've been running an amd64 version of Ubuntu since 7.04, and I was a little late to the party. 64 bit drivers for Linux have been trivially easy to find. Even most of the applications simply required a recompile with the 64 bit option turned on. The only real hangup was Adobe. Flash took forever to perform correctly in a 64 bit version of Firefox.
BTW, the fact that Win7 has an artificial cap at 192 GB? FAIL. You'd think that after Dos, Win95, and WinXP they would have finally learned not to do that. (Or have they been doing intentionally all along to help force us to upgrade? Inquiring minds want to know! ;-) )
Your comment reminded me of an article that Mark Russinovich wrote. Basically 32-bit Windows isn't actually physically limited to 4GB of RAM and can overcome it with PAE too (and originally XP would allow this). The reason why there's a limit now (Microsoft removed PAE support and won't use it in Vista or 7) is because the majority of drivers weren't written to cope with a more-than-4GB-addressable-memory that PAE allowed, leading to unacceptable crashes. Therefore, they simply erred on the side of caution and (for consumer versions of 32-bit Windows at least) they limited it to less than 4GB.
With PAE, a 32-bit operating system can access up to 64GB RAM.
Normal 32-bit processes can only access a 4GB address space, but each process can have its own 4GB on a PAE machine (the same as on a 64-bit computer - Windows x64 gives each 32-bit process its own 4GB VM), so if you multi-task enough you can make use of >4GB RAM.
However, a process can use AWE in Windows to access the memory beyond the 4GB limit, as long as it's prepared to do its own memory management.
Linux and Mac OS X enable PAE on just about all computers; Windows only does if you are running Enterprise Edition Server or Datacenter Edition Server - neither Windows Client, nor Windows Standard Edition Server will support PAE beyond the 4GB limit.
Microsoft say that (a) this was a licensing requirement and (b) they found quite a few drivers that blue-screened Windows if there was more than 4GB addressable.
It still keeps a single app/process limited to 4GB, and the OS limited to inside 3GB. For a server, or a heavy app like Photoshop, video editing, etc., it's no good. For a home user who simply keeps a ton of crap open, but little of which uses 2GB or more each, it's passable. PAE was designed to let workstation folk handle multitasking while still waiting for the hardware to become 64-bit, and it got used in things like small office servers that ran lots of tasks on a single box, but it's no substitute for native 64-bit support, and it is a resource drain (small, but notable) as well.
My main system is Win7 dual boot between 32- and 64-bit. I find I spend 99% of my time in 32-bit mode. This is because:
* some older peripherals don't have 64-bit drivers
* I've never seen memory utilisation above 60% on a 3GB system
I suppose if multi-GB files are a requirement for you, then maybe - but I refuse to believe an XL file can get that big, and RAW image files are a few tens of MB. Video editing, maybe?
Windows tends to disk swap aggressively, so memory utilisation may well be 60% or less, it just means you've got stuff swapped to disk too.
I frequently get close to my physical memory limit, and I have 8GB. A quick glance at my task manager now tells me I'm using 4.68GB, and I'm not doing anything unusual. But then I suppose not everyone needs to run multiple copies of gearbox modelling software on a day-to-day basis...
As difficult as it is to believe, Windows is quite intelligent when it comes to memory utilisation. It will normally aim to keep as much as it needs (or thinks it needs, or will need in the future) in the highest level of cache possible. This includes processor registers, L2 cache where available, RAM and finally the HD swap file. It'll use as much as it can, though will always leave a gap if it can for the "ooh I didn't think of that" moments.
It's not about memory utilisation, but the responsiveness of your machine, regardless of whether you're editing video or working on a 100K Excel document. The less RAM you have, the less carefree Windows can be in keeping page files off the HD.
With the prevalence of 720p and higher camcorders, the need for more than 3GB of RAM for a single app is upon us. Even Photoshop can peg a 32-bit system pretty easily. I have scans that are over 1GB, I've seen a 2GB PDF for crying out loud... PST files can easily climb above 4GB.
Memory utilization is misleading, as it does not take into account system processes that guarantee free memory. Give a 2GB system that was 30% free RAM another 2GB of RAM and it usually will still show 30% free, having used the other 2GB it was given. Win7 especially will attempt to use all you can throw at it, and the performance improvement is notable from 2-4GB, and even up to 6GB on most systems (especially if slower HDDs are in use). Going past 8GB has little impact, but going past 4GB does.
Also, under Win7, ANYTHING compatible with any Win7 version has to be certified for ALL of them, 32- and 64-bit, so the only 32-bit-only drivers you should be finding are for legacy systems (and SOME of them still work, typically any that were at least Vista compatible).
Other than a few old HP printers, which were fixed by using a driver for a different printer, and an old webcam which, to be fair, can be replaced at the price of things these days, I have no problems with a full 64-bit OS and drivers. Many, many apps are still 32-bit, but for those that are it usually doesn't matter (Photoshop etc. are 64-bit now). So things have improved a lot in the last 2 years, I'd say from my experience.
Correct me if I'm wrong, but the 4GB limit only exists if you're not running PAE, which many OSes have had for a while. You can only go up to 4GB in Windows desktop OSes, but Windows Server has had it for a while. Linux has it on the desktop too, obviously.
Recently added a default 64-bit Windows 7 Ultimate preinstalled on a Dell Vostro laptop to a bog-standard W2003 domain running a Sharp PCL b/w & Oki colour laser PS printer with a 32-bit well-known proprietary dental software suite.
No 64-bit drivers for the printers. Not good, and I wasn't going through the hassle of installing unsigned 32-bit ones, so I went for generic PCL/PS instead.
The Imagelevel imaging software bombed out at install, and the well-known dental software wouldn't work - it ran out of 32-bit handles, whatever they are, during the install, kicking me out in the process.
AAAARGH! Call to the helpdesk - no 64 bit s/w available from wellknowndentalsoft & co.
Had to rebuild using W7 Enterprise 32-bit from Technet then getting a licence from M$, all works.
I've never noticed any speed increase at all with 64bit.
Totally pointless to the end user really.
The (default) 64bit version of Firefox on OS X has very impressive speed improvements over the (legacy) 32bit version.
While 64-bit AMD64 code (or whatever Intel calls it - remember, AMD did it first) is faster than IA32 code, mainly because compilers get to use the larger visible register file that AMD64 specifies, this is not always true on other architectures.
On the PPC, for example, for a number of reasons, code that can be restricted and compiled to 32 bits can actually be faster than 64-bit code. But in my experience this difference is often not as drastic as the difference between AMD64 code and IA32 code. In the case of x86, compilers just really seem to love the extra registers, and this more than offsets the slightly increased overhead of the slightly larger code and data.
Why you are not seeing much difference, doowles, might be the result of an I/O bottleneck, i.e. hard disk/network, and possibly also swap. Just a guess, but this is by far the biggest bottleneck I feel I have in my day-to-day use. That is unfortunately a problem that going to 64 bits will not solve, and may if anything exacerbate, due to slightly larger data/executables.
I personally have grown to love SSDs - they help somewhat. But my god, are they costly...
That explains why ARM is concentrating on 48-bit memory space rather than just doubling to 64-bits.
where does OS X sit in this?
I know the OS itself is (at least mostly) 64bit, and many of the apps that I run show as '64-bit intel', what happens to the system as a whole when a 32-bit app runs?
In 10.6, you are given the choice of 2 kernels...
Well, you may not have known you had the option, for Leopard will default to a 32-bit one for general use, and a 64-bit one for the more bleeding edge, which you had to deliberately choose to boot. I believe the first true 64-bit kernel was released with Leopard. (My machines still run 10.5, so I've never run it.)
I believe (but do not know for sure) most people still run a 32 bit kernel.. And this is because of the issue of the wider availability of 32 bit drivers.
I don't know if the snow leopard default kernel has been moved to 64 bit for I have not upgraded nor have I bought a new mac ( nor will I, jobs is nasty, down with apple yadayadayada that sort of thing). My guess is snow leopard still defaults to a 32 bit kernel on boot.
So, what happens when you run a 32 bit app? Not much more. Your kernel is probably 32 bit. When you run a 64 bit app in userland, you likely context switch to a 32 bit kernel.
I would imagine that the context switching that occurs is.. not very elegant but I can see why they have to date done it this way.
So most people running OS X by default are in fact not running a true 64 bit OS though in fairness it supports a 64 bit userland so really, practically speaking, for most people out there, it's not worth quibbling about at least for now.
...do I need 64-bit beasts on my desktop if I have virtualised said desktop? From the past few days on El Reg it seemed like everyone and their donkey had a hankering for virtualised desktops, so I am a bit puzzled why the 64-bit question even arises.
Outside of a few niche uses (e.g. video editing, large image manipulation and maybe some devs), who really needs more than 4GB chuntering along on their desktop?
If you can get 64bit for the same price why not choose 64? And as for the desktop virt setup, the server side is certainly 64bit so why not choose to boot a 64bit kernel on the client just to avoid complications?
Why is this still an issue FFS?!
Reading about 64bit desktop computing as being "new" gets pretty depressing for those of us who have had 64bit Unix workstations for well over a decade.
Intel and MS thought the 64-bit server business was going to be on the Itanium, so the 2001 Windows 64 was the Itanium edition. (In 2001, 64-bit Unix was custom built to match specific hardware, so MS wasn't out of step in that regard.)
Windows XP Pro 64 came out 2005, making 64 bit computing available for the masses. The reason most people haven't run 64 bit operating systems since then is -- most people didn't care and didn't need it. Even though you got both sets of installation disks for free -- 64 bit offered nothing to most people in 2005, let alone 1995.
On the other hand, there have always been those who wanted to say "my disk is 32 bits longer than your disk", and it looks like they are all showing up here.
First of all, traditionally almost all PCs have been sold with less than a decent amount of memory (remember the standard 256MB or 512MB of memory, also shared with video RAM?). Now if you look around at retail PCs, and especially laptops, they rarely ship with more than 3GB or at most 4GB of RAM installed (all slots filled), and not a lot of people decide to throw away the existing memory and buy new chips to fully maximise the RAM. For your info, trying to maximise RAM in a Dell laptop might add 20-25% to the final cost. So, 32 or 64 bit, we will never see a difference, because the machines will always be sold with a minimum amount of memory.
Secondly, nobody ever said that processing files larger than 4GB requires more than 4GB of RAM. And, please excuse me, if you're starting to work with CAD (AutoCAD at home, WTF?) or Photoshop files larger than 2 or 3 GB, then you've become what we call a professional, and that requires professional hardware.
I've been running 64-bit Vista/7/Ubuntu for the better part of 5 years now and haven't noticed any issues with drivers or support. Core 2 Duos were 64-bit, and even the cheaper Celerons or Pentium Ds based on the same core have been out for a while. There's no reason in this day and age to still run a 32-bit OS, and I get annoyed every time I see a 64-bit-capable laptop with 4GB of RAM ship with Windows 7 or Vista 32-bit.
I will have to say that Windows XP 64bit sucked because of the driver support. But luckily Vista was released 64bit at the get-go which pushed a lot of manufacturers to come up with 64bit drivers. 7 continues that trend and offers even better 32bit compatibility modes in case you do run into the issue.
If you're using the machine to do anything floating point intensive, 64 bit mode is pleasantly faster.
It's also much less prone to root kits due to the signed driver requirement.
bit width of the addressing has nothing to do with floating point performance.
On the x86 architecture, 64 bit mode buys you a rather different floating point architecture to that offered in 32 bit mode. Hence there's a considerable performance difference.
...for the software guys to actually write full 64-bit software so we can do away with the WOW64 portal for 32-bit apps. This of course will no doubt mean a total rewrite of some software, or a recompile with some newer libs that support both. I have a good example of software that works fine on XP / Vista / Win 7, but only in its 32-bit version, because - wait for it - they first wrote it when .NET 1.1 was still okay to use, it needs some part of that to run, and guess what: MS never made a 64-bit version of .NET 1.1.
Oh to be a software programmer.
Mines the one with the .NET 1.1 install CD in the pocket.
1. 64 bit doubles the register space. This is actually noticeable on two counts:
1.1. There is a significant increase in speed and lower latency for OSes/GUI frameworks that do not spend most of their time twiddling their thumbs in a UI switch statement. Mac OS X and Linux+KDE get a significant boost; Linux+Gnome not so much, but still noticeable.
1.2. Compilation, at least for C/C++, takes about one and a half times as long.
2. The article is very Windows-centric. None of these problems exist on a Mac OS X or Linux desktop. 64-bit truly shines on Linux, to the point where even an archaic Athlon 64 3200+ with 2GB of RAM is worth putting into 64-bit mode instead of keeping it at 32.
3. AMD specifically can change frequency/power levels faster when in 64 bit mode. So you get (though marginal) better performance on power-saved desktops and laptops as well.
The ABI is significantly improved, too: with parameters being passed in register by default, and a pre-reserved "red zone" that eliminates the need for most "leaf" subroutines to ever twiddle with the stack. Unfortunately Microsoft limited their ABI to just four register parameters, whereas everybody else allows the first 6 integers and the first 8 floats to be passed in register. (And, as noted above, 64-bit mode uses XMM registers rather than the 8087) All of which contributes to the speed improvements. (The doubling of the register file is still a huge one, though.)
Title correction: `The Microsoft 64-bit question'. Everyone else has already moved on.
I've not seen any desktops other than 64-bit around for 5+ years, and have not seen many specific 64-bit-induced problems either. But yeah, we don't run MS Windows except on a few legacy systems (and those are 32-bit).
Try buying a decently specced consumer machine without it at the moment - it's very difficult. About half the decent laptops being punted in retailers seem to be 64-bit, as they have 4GB of memory or over, but they're not always clearly labelled as such.
I think that right now we are seeing a sea change in what consumers are buying - with luck this should start to get more 64 bit drivers and apps out there, and that will help to make 64 bit Windows a bit more mature and corporation friendly.
I have just ordered myself a 64 bit Windows 7 machine for video and graphics work where my trusty 32 bit Windows XP machine struggles. I look forward to failing to get my apps to run on it.
First of all, virtual addresses on the x64 architecture are currently limited to 48 bits, and physical addresses to an architectural maximum of 52 bits; the remaining bits are ignored by the CPU, and some programmers have used them for other purposes. Since CPUs are likely to stay backward compatible because of those reused bits, the current x64 architecture may never go beyond 52-bit physical addressing.
And the clear benefits are following:
1. 64bit general purpose registers, 16 registers instead of 8
2. All 64bit CPUs support SSE2. 16 registers, each 128bit wide
3. 64-bit computations WERE needed - and SSE2 is limited to 64-bit computations as well.
Benefits are now over.
Obviously this is not enough for marketing purposes. So, read this "good article" again.
In addition, it's unclear how well high-level compilers (C++ for instance) support these new registers. I'd say support is not even satisfactory. Most heavy computation tasks, like encryption, are written in assembly language, where high-level compilers are not needed. Most likely programs like Photoshop or AutoCAD also have a good chunk of assembly code.
Upgrading to x64 architecture LATER is likely to be cheaper because:
1. old CPUs and other old hardware will become cheaper
2. and, for instance, AutoCAD (or any other soft) will keep file formats compatible between the 32-bit and 64-bit versions. All you need is to install another x64 OS and the needed soft (but you obviously have to buy it - and why not go for the new and improved 2012 version).
1. Larger addresses (mostly 48 bits) are a plus, but only show off in intense tasks like video or graphics editors, data storage and maybe servers. But truly, you can just load your files in chunks and save the RAM you have - modern 'programmers' are lazy, though.
2. Device drivers do not win anything from x64, as they are I/O intensive.
You will need to re-qualify all your tools to ensure that the results are the same before and after.
I installed Win7 64-bit on a PC. Broadly speaking there hasn't been any hassle in doing it. Most popular apps (with the exception of Firefox) have 64-bit native clients, and 32-bit apps run just fine too. Biggest issue I have faced so far was Android's SDK didn't like my setup for some reason and I had to use Java 64-bit to rectify the issue. But when you install Java 64-bit you discover none of the JNI DLLs work any more so if you're running Eclipse and use Subversion for example you have to play "hunt the native 64-bit DLL" and it's all a bit manual and nasty.
So as an end user things are not quite smooth sailing but they're close enough. The benefits of 64-bit largely boil down to support for more memory and perhaps some marginal improvements in other ways.
My biggest beef with the whole 32-bit / 64-bit thing is why Microsoft made it such a pain in the arse. Why should a user have to choose their version? Why should devs have to build two versions of the same code? Why should companies have to QA two versions of the same code?
OS X didn't come in a 32/64-bit version. Both were one and the same and the transition was incremental and seamless. The OS saw what CPU you had and then enabled different code paths depending. Binaries were also "fat" which was a fancy way to say the same app package could run on different architectures. The funny part is watching MS announce Windows for ARM and realising any problems that came before now will look like a walk in the park compared to ARM vs x86 issues. It amazes me that MS haven't LEAPT on LLVM or similar tech so that devs nor end users really have to care what architecture they're writing against - just compile to LLVM and let the OS figure out how to run the code natively.
"Most popular apps (with the exception of Firefox) have 64-bit native clients"
I'm confused. Are you saying that Firefox is not a popular app but has a native 64-bit client? Or do you mean Firefox is a popular app, but you didn't click the "64-bit version" when you downloaded it?
Been running Firefox 64-bit here for ages. (However have to run IE8 32-bit because the 64-bit version doesn't seem to be stable for me)
According to http://support.mozilla.com/en-US/kb/Using%20Firefox%20with%20a%2064-bit%20Operating%20System?s=%2264-bit%22&as=s
"Using Firefox with a 64-bit Operating System
Firefox will work on 64-bit versions of Windows, but it is not a 64 bit application. See system requirements for details on all supported systems.
For details on how to install, see Installing Firefox on Windows.
Firefox will work on 64-bit versions of Mac OS X, but it is not a 64 bit application. See system requirements for details on supported systems.
For details on how to install, see Installing Firefox on Mac.
Firefox will work on 64-bit versions of Linux, but it is not a 64 bit application. See Installing Firefox on Linux for details on obtaining the proper version of Firefox for your Linux distribution.
Mozilla is working on providing 64-bit versions, but no schedule is set.
Contributors to this page: Chris_Ilias"
By the way, the link to Windows "System requirements" is broken. So maybe the whole lot is out of date.
Not that I'm concerned.
Definitely running 64-bit Firefox here:
/usr/lib/firefox-3.6.15/firefox-bin: ELF 64-bit LSB shared object, x86-64, version 1 (SYSV), dynamically linked (uses shared libs), for GNU/Linux 2.6.15, stripped
I'm aware there is a 64-bit experimental build but it's not released officially yet. And doing so means plugins like Flash, Acrobat, Java, Silverlight etc don't work. And yes I want the plugins to work.
It's the usual problem of how to host 32-bit plugins in a 64-bit browser. I suppose that since Firefox can run most plugins in a separate app, maybe it's possible to have the client be native 64-bit and offer a 32-bit host / thunk exe for legacy plugins.
I expect some add-ons might also have 32-bit dependencies which would be another issue to think about.
nspluginwrapper is a method to run 32-bit plugins inside a 64-bit Firefox.
firefox: ELF 64-bit LSB executable, x86-64,