J is merely the notation given to an S wave in the inner core of the Earth.
Nice observation, Sherlock.
Running gdb as an engine in the IDE or even TUI mode (Ctrl-X Ctrl-A) no good?
At the previous company I worked at, most developers (myself included) were working on Windows with Visual Studio. The few who did work full time in Linux didn't use a specific IDE, so there were no projects set up. I only ventured into Linux when there was a bug to fix that nobody else could figure out.
I found TUI mode to be hit and miss at the best of times.
Prior to that, I was working on an extremely old code base on extremely old versions of Solaris which was running even older versions of pretty much everything. Not an IDE in sight, and the version of GDB didn't even support TUI. (Or if it did, it didn't work. It was a while ago now; I forget all the details.)
These days I get to play mostly with Visual Studio (as I have almost daily for the last 22 years or so), Xcode (the only IDE where tabbed documents were added as an afterthought), and Android Studio (which only allows debugging native code on Wednesdays in months that don't have an R in their name).
The limited byte range is an x86 hardware breakpoint limitation - each of the CPU's debug registers can only watch a few bytes at a time. That should be easy enough to work around though, so I'm not sure why Visual Studio doesn't do that out of the box. (You can change the debug engine in Visual Studio so it's not impossible - WinDbg supports large data breakpoints, and integrates with Visual Studio, apparently.)
Setting watches could be a bit easier, but overall I find the watch/variables windows - especially when using custom data visualisers - pretty useful. To be honest, the biggest gripe I have with these is the shitty two-column tree view they use. More often than not I just hover over variables in the editor window and traverse the tree in the popup. (I noticed that the latest version supports right-clicking on any variable and setting a data breakpoint on it immediately, which is nice.)
The problem I have with GDB is its horrible (or really complete lack of) UI. Anything where I spend more time Googling how to do something than actually doing it doesn't get me overly excited. Every time I have to use GDB I usually end up wishing I could just see the fucking code I'm stepping through.
I think the real power of Visual Studio starts to shine when you're debugging multiple projects concurrently. It's so much easier these days to be able to set breakpoints in both a client and a server, for example, and seamlessly switch between them. Gone are the days of having to run two instances simultaneously and try to synchronise them manually. Ditto for debugging browser/web server systems in the same manner. That and the fact that debugging crash dumps just works - especially if you have symbol and source servers connected to your build system.
That said, it's definitely not the perfect IDE. While the feature list has continued to grow, I think the overall quality of Visual Studio has dropped considerably in the last decade. The fact that it's still better than most other IDEs out there really doesn't say much about the competition though. :)
Of all the many things that I find lacking in VS, the debugger tops the list.
What features from other debuggers are you missing?
Edit: I won't say the VS debugger is perfect, but compared to the other debuggers I have to regularly use (Xcode and Android Studio - which is basically Jetbrains, and, on occasion, GDB) it's light years ahead.
You've got that the wrong way around.
Windows 10 is no less stable than Windows 7, and is more stable than Mac OS. (In my experience.)
Also, if you find the flavour of OS you use affects your productivity, then you're using the wrong tools. (Mostly - I still haven't found any non-Windows IDE that comes close to Visual Studio in terms of features or ease of use, especially in the debugger department.)
Clearly money well spent if you're still talking about it.
Clearly futuristic air taxis just need an auto-vaporise-on-failure feature. This will solve the potentially costly hospital bills of any passengers, and remove the possibility of someone on the ground being crushed by a falling vehicle.
There are plenty of interface definition languages/systems out there today, and they're all pretty much shit.
I'm guessing (no, of course I haven't checked out the code to see for myself) this basically automates the whole thing by storing interfaces and call information in a side channel of data in whatever library/executable you're producing, and uses that data to transform calls from other languages at runtime (or maybe even at compile time if you have that option) so that they just work.
Which sounds pretty cool if you ask me.
And would be a hell of a lot better than implementing a C-based public API if you're a library writer, dictating the language users of your code must use, or dicking around with your least hated IDL/RPC/RollingYourOwnShit to get processes talking to each other.
Presumably the big pay off here would be having OS APIs (or any library really) immediately available in any language that the system supports without all the tedious fucking around implementing support. You'll basically end up with language choice being pretty much irrelevant when it comes to writing and releasing code.
How is yours so flaky?
So, for the first few years of owning my mac, I didn't really use it for much. I needed xcode to publish iOS builds of the game I released many years ago, and that was about it.
Some time after that I realised I could set up Apache and use it as an internal server for web development and testing, which I (slowly and painfully) did. That was working fine for a while. Until I updated macOS. That update effectively destroyed all my custom network settings and the entire Apache configuration. Now, it's entirely likely that I didn't set things up "correctly". But in all honesty, if macOS ships with something like Apache, then it should ship with a fucking configuration tool, so I can't set it up wrong. And if its configuration files have changed, maybe it should not blindly fucking overwrite them when updating. Apple love to harp on about how user friendly their machines are, but to be honest that's been a complete crock of shit for at least 20 years now. (I used OS6 (or 7, I don't remember) for quite a while back in the day and absolutely loved it.)
A second update destroyed the VMware setup I was using. Now this was basically due to VMware stopping support for the product I had bought (it wasn't more than two years old) and Apple not giving a rat's shit about backwards compatibility. Suffice to say I will never buy anything from VMware ever again.
After losing the Apache configuration I decided to invest in something that could manage that shit for me, so I bought MAMP Pro (iirc) and set about rebuilding the test configuration for my web development setup. That worked fine until, once again, I updated macOS and, once again, that system was hosed. The version of MAMP Pro that I bought doesn't really work with the latest macOS (hello, backwards compatibility) - it sort of works, but mostly doesn't, for no obvious reason that I can see. I can't be arsed to figure out why, and I don't really need it these days anyway.
Now that I can't update macOS, my copy of xcode will very shortly be too old to support the features our app developers are using, and at that point I will throw my mac in the garbage where it belongs. (And pray work doesn't buy me a new one.)
I think that brings us up-to-date. Who needs a drink?
If it was an Apple product, updating would basically hose everything and reset all settings to factory defaults. I've had that happen more than once when installing a macOS update. Thankfully my iMac is "too old" to install the latest version of macOS.
Yeah and then the CPU optimizer kills your loop dead because it just doesn't do anything
So make it do something. Personally, instead of hardcoding the two values to be added, I'd just call rand a couple of times and shift/xor those values into the total, and then just store the total (which is meaningless) alongside each real seed value.
As long as they're stored outside of this function and accessed somewhere else, the optimiser should leave those values and the code that generates them alone. Tricking optimisers isn't usually that difficult. (The hard part is tricking them into doing what you want them to do.)
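A minimal sketch of that idea in C - the array names, sizes, and the particular shift/xor mixing are made up for illustration; the point is just that the decoy totals live in globals that get read elsewhere, so the optimiser can't prove the rand calls are dead:

```c
#include <stdlib.h>

/* Illustrative names - not from the post. The decoys are meaningless
 * values derived from rand(), stored alongside the real seeds. */
#define NSEEDS 8

unsigned int real_seeds[NSEEDS];
unsigned int decoys[NSEEDS];   /* read somewhere else, so not dead code */

void generate_seeds(void)
{
    for (int i = 0; i < NSEEDS; i++) {
        unsigned int total = 0;
        /* call rand a couple of times and shift/xor into the total */
        total ^= (unsigned int)rand() << 1;
        total ^= (unsigned int)rand() >> 3;
        real_seeds[i] = (unsigned int)rand();
        decoys[i] = total;   /* externally visible side effect */
    }
}
```

Because `decoys` has external linkage and is accessed outside this function, the compiler has to assume the stores matter and keep the whole loop.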
I presume you both meant до свида́ния ("goodbye").
An open plan office full of people wittering at their computers and a good percentage of those conversations descending into screaming fits?
Exactly how is that different from any open plan office today?
The upcoming AMD TR2 with 64 cores should equal that if it is a 250W part.
The TR2 is a 32 core chip with 64 hardware threads.
Well there are 32 of them in there. So about 7.8W each.
I have a Lumia 950 too. And a Huawei P20 Pro that work bought for me. The Lumia is my everyday phone still, except for when I need to go to the big smoke, in which case I'll take the Android because I need it to buy my transport tickets (because the fucking bus company refused to ask their payment service provider to fix their Mobile Edge support).
If my 950 dies (and I've dropped it onto concrete enough that I'm almost convinced it's indestructible) I'll switch to the Android for now, and then pick up an Andromeda device when they finally ship.
Technically speaking, Windows 10 Mobile is not the same as Windows Phone.
If the fucking press would stop confusing the two, more developers might make the effort to put out apps that support both mobile and desktop (and xbox 360/one and hololens and hub).
Amazon also have thousands of employees working in slave-like conditions, earning such piss poor wages that they also qualify for and rely on welfare handouts, while Amazon are avoiding as much tax as possible and claiming any and all the government grants they can.
As much as I feel that Microsoft still has a way to go with regards to Azure stability, they're getting there and I'm willing to put up with things like this if it means I can happily tell Amazon to go fuck themselves.
1) 2010 does not require additional runtimes (i.e. the 2017 C/C++ redistributable) to be installed.
All versions of Visual Studio (and Developer Studio before it) support static linking of the runtime libraries (the /MT compiler switch). 2010 isn't special in that regard. It might not be available in the free editions, but it's always been there in paid versions.
2) 2010 with the Intel C/C++ compiler installed gives a slightly newer standard (C++11) which can keep the 2010 IDE going for another decade ;)
The Intel Compiler is a very good optimiser, but it's not exactly up-to-date these days. And it doesn't really help you if you're targeting Windows on non-x86 platforms.
4) Worst case scenario, to keep with the 2010 IDE, just install clang which has great integration (and newer standards support).
Actually, the worst case scenario would be having to use Xcode (which goes out of its way to make things harder and more time consuming) or Android Studio (which takes a fucking lifetime to build anything and rolls a dice at midnight to decide whether it will let you debug your application for the next 24 hours).
VS and Window's header files just clog up everything - haven't they heard of namespaces and keeping things compartmentalised.
You mean how Linux does? Or maybe you're talking about iOS or macOS? Yeah, I sure do miss all those namespaces when I'm working with the Windows API.
If anyone's interested, this made most of the horribleness disappear:
border: #000000 0px solid;
margin-bottom: 10px !important;
display: none !important;
PS: It'd be nice if every line in a comment wasn't given its own P, or if you'd fix PRE in comments.
Top Stories - don't care.
Most Read - don't care.
Make the above collapsible (and remember my settings).
The rest I can fix with Stylus.
...what percentage of your readers opt out.
Your Phone does not support MMS messages, although with the costs inflicted by carriers on users trying to use the service, this is probably no bad thing.
What costs? Are there any cell phone service providers who actually charge for anything other than data or special services these days? (If yours does, you might want to consider changing...) ;)
My only concern now is, now that rocket science is easy, what are we going to use to explain how much easier something else is?
It's not...? (Answers on a postcard to the usual address.)
It sounded to me like an over-paid exec discovered that he was expected to do some real work for a change.
A little update... once the affected machines were back up and running, according to Microsoft, there were 123 million builds queued in that data centre.
Our Azure services are hosted in Europe, so in theory haven't been affected by this outage.
For some unexplained reason though, our hosted Mac build servers are in the US. And yep, they've been down all day.
Multiple copies is reasonable, but gets old when you're dealing with multi-gigabyte git repositories.
Yeah, our repositories are fairly small and won't be growing significantly in the future.
To be honest, the last time I tried to use git with a large repository (20GB+) it never got more than 30% into building the index before crashing. We tried a few other systems too, and eventually just bought some Perforce licences which took it all in its stride. I've also been using Perforce at home for well over a decade now, and wouldn't even consider using anything else.
For example, only recently did Git understand that you may need to have different branches of a repo on your local disk at the same time, and their solution is still just a workaround.
Is there even a real solution? I just always had (and still have) multiple copies of whatever repository I'm working with.
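For what it's worth, the feature presumably being referred to is `git worktree` (added in Git 2.5), which lets one clone have several branches checked out at once, all sharing the same object store. A throwaway-repo sketch, with made-up branch and directory names:

```shell
# Demo in a disposable repo; all names here are illustrative.
set -e
cd "$(mktemp -d)"
git init -q repo
cd repo
git -c user.email=me@example.com -c user.name=me \
    commit -q --allow-empty -m "init"
git branch some-feature

# The actual feature: check another branch out into a sibling
# directory, backed by the same .git object database.
git worktree add ../repo-feature some-feature

git worktree list    # shows both checkouts
```

Whether that counts as a "real solution" or a workaround is a matter of taste - the linked worktree is still a second directory tree, just without a second copy of the history.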
To be honest, I was surprised to find (when writing my original comment) that Firefox isn't based on Chromium. It's such a blatant rip off of the UI (and much poorer as a result) that I thought they'd ditched all their internals like Opera did - which was basically the browser Firefox used to rip off all the time, so that would kinda make a fucked up sort of sense.
One where you have choice.
Apart from a superficial layer, there's not exactly much choice though, is there?
Chrome = Chromium.
Vivaldi = Chromium.
Opera = Chromium.
Firefox looks and feels like Chromium (and probably contains a lot of its internals these days).
Safari (Doesn't exist on non-Apple platforms anymore.)
That was my first thought too. It is a real word though.
I still think Mathematicians should stick to numbers and letters from the arse end of the alphabet though.
I wouldn't say I'm an anything fan these days. I tried using Unreal 3 back in the day, but that was just horrible. The last game I released was with Unity, but that was around six years ago now. Never tried CryEngine, but I have to admit I was always impressed with the way it looks.
Generally speaking though, I'm not talking about the results that can be achieved - all engines are pretty much the same in that regard. And from an artist/designer perspective, I doubt there's much in the editor experience either.
I'm talking about the very specific engineering quality of the Unreal Engine code. Partly from having had to listen to friends who are still in the games industry bitch and complain instead of getting on and drinking, and partly from my own experiences with UE3 (which was seriously shit in so many ways - and we had a source code licence, so I wasn't just limited to their script engine which as far as I could tell was written by a three legged, blind dog who had heard about language design from a wedding reception DJ he got chatting to in a queue in a 7-11 late one evening).
To give an example, this was a scripting language where something as basic as declaring and initialising a local variable couldn't be done in the same statement.
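For anyone who never had the pleasure - as I remember it (UE3-era UnrealScript, so treat this as recollection rather than gospel), local declarations had to sit bare at the top of the function:

```
// UE3-era UnrealScript, from memory - illustrative only
function int AddOne(int Value)
{
    local int Result;    // 'local' declarations come first, no initialiser
    Result = Value + 1;  // initialisation is a separate statement
    return Result;
}
```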
Epic's security is not first class as their expertise is game tech.
That made me laugh. If you had first hand experience with their game "tech", you wouldn't consider it first class. The Unreal Engine is the Ryanair of game engines - it may get you where you want to go, but you'll feel dirty, bruised, and abused when you get there.
Doesn't that mean that anyone sitting between your browser and the server can inject whatever they like into the pages served by the server before they reach your browser? Just because a site doesn't need a secure connection to serve its pages, that doesn't mean you should assume you're safe when you visit it.
The Nigerian Prince edition?
If you look at their track record of failed projects/products, they're not a very good at anything company.
The only thing they (currently) have going in their favour is an almost bottomless money pit they can throw behind anything they like. Until the team responsible gets bored, the key people are poached by a competitor, or someone higher up has a "better" idea.
Take away their money pit and Google would be a rather mediocre software company that would more than likely disappear into obscurity within a couple of years.
...beating Microsoft Visual C++ at optimising...
...is a pretty easy thing to do. Write a semi-complicated function and the optimiser generally gives up halfway through. At least it used to. Even when it manages to make it to the end of the function, I never found myself suitably impressed with its results. (To be fair I haven't really looked too much into the code it generates these days.)
The Intel compiler on the other hand... That is and always has been a serious optimiser when it wants to be. The things I've seen it do to code I wouldn't want to talk about in polite conversation.
Be insured it makes no sense at all.
Given the general level of grammar and spelling on the Internet these days, I really don't know if that was an extremely clever comment or a monumental fuck up. :) I'll give you the benefit of the doubt this time, but next time remember to end your sentence with a smiley face and/or use an appropriate icon. ;)
And what about multiple accounts for each actual human?
I find that less likely than Facebook counting each *device* individually. I have at least four different devices that I use depending on where I am and what I'm doing.
Yes, but nowhere important.
Is that the aim of Brexit then? To make the UK as important as everywhere else that drives on the left.
...which is the safest in the world?
Yeah, it's scary how many safety features are in those things. But not as scary as how many are missing from other power sockets/plugs outside of the UK. I'm constantly amazed that the rest of the world hasn't burned down from electrical fires by now.
Nah, XP was pure Disney.
The default desktop bears a striking resemblance to Windows 10, replete with a familiar-looking start menu.
Err... yeah, right.
It looks pretty much how I would expect Windows to look if it had been designed by Fisher-Price in the 80s.
Just export your settings somewhere and reimport them on the few occasions Visual Studio resets them. I always have a backup handy ever since I realised how terrible an idea auto-syncing from the cloud is.
I've dealt with my share of lazy developers, but I'll bet that most devs understand a fair bit about optimisation, and I'll bet a lot of them even care.
Perhaps lazy wasn't the right word. But in all honesty, I don't know many developers who really understand shit about optimisation these days. Certainly there isn't the same understanding about what a compiler does to your code that there used to be. And as more and more higher level languages become more and more popular, people really don't understand the implications of what happens when they write something one way instead of another.
Back when I was working at [a rather large game developer] not even 10 years ago there was a pervasive sentiment among most of the developers* that they write code, and someone else will optimise it. And to be honest, I haven't seen any differences in attitudes at other places I've worked since.
While it's nice to think people do actually care, in reality most of them don't. The tsunami of shitty Android and iOS apps eating your battery is testament to that.
* There were a couple of *really* good developers there who basically spent every stand up meeting trying to drill into everyone else: Use less memory; Profile your code. (This was for a last-gen console game.) I even ended up writing a memory profiler** to help show people what the consequences of their actions were.
** I've written another once since. If you ask nicely, I'll share a link. ;) Not going to spam it at you right away.
In all fairness, a lot of resource requirements these days come from lazy developers who don't understand shit about optimisations.
I only have 16GB of RAM in this machine, and since I switched to Vivaldi (Chromium) as my primary web browser (which seems to require two to three times the amount of memory per page than an older version of Firefox), I find myself running out of memory more often. Ditto for SmartGit (hello, Java) which uses a laughably high amount of memory for what it does.
And look at how much storage space the average application needs these days. It's insane.
There are some applications that obviously need a lot of memory and storage space - databases and video editing are two that spring to mind - but there's a hell of a lot of waste these days. Won't developers think of the children/environment [delete as applicable]?
</oldman> <-- Does this mean I'm dead now?