Where did dBase come in, then?
I still use it. Well, the Harbour compiler, so maybe I should say Clipper?
It seems coders cannot get enough of Rust, according to a survey conducted by dev saviours Stack Overflow. The 2019 survey had almost 90,000 (a bit down on the 100,000 from 2018) developers venting their collective spleen on life, languages and loathings. While speed and safety-focused C++-alike Rust retained its crown as the …
A company I worked for ten years ago did a survey on development environments for programming microcontrollers. We drew two unexpected conclusions from the respondents:
1. People who use Linux were more likely to fill out surveys than people who use Windows
2. People that use open-source languages and tools were more likely to fill out surveys than anyone else
I don't know if this still holds true today.
I'm also reminded of when DataGuess ran a survey on popular 8-bit microcontrollers around 1994 and the results said the market was dominated by the National Semiconductor COP800. However, those of us that worked at NSC knew that the COP800 had a minuscule slice of the market pie compared to Motorola's 68HC05 and 68HC11. The COP800 fans loved the MCU so much (we called it The Rocky Horror MCU) they were more likely to fill out the survey than anyone else.
This all sounds quite plausible.
There are people who are very serious about this stuff. They do surveys, work on open-source projects, write blog posts. And then there are people who turn up to work, do their job and go home. Those people don't tend to be that fussy about what they use. The boss rather likes them, because they won't get fussy if they have to fix an old system built in VB6; they'll just get on and fix it.
And I guarantee there's a lot more jobs fixing VBA out there than Rust programming.
When I first heard about Rust some years back, I thought it sounded really interesting but I thought to myself "Who the hell is going to hire a Rust developer?" Little did I know what was in store for me. The medium sized company I joined recently chose Rust as their new primary language going forwards a couple of years ago and hasn't looked back. I haven't picked it up yet as that's not what they hired me for but I look forward to learning it at some point.
Yes, I'm looking forward to getting into Rust too. It's a language that seems to offer much for productivity, as a whole range of bugs (especially memory misuse) that have to be tested for or discovered in debug / review for C++ are found by the Rust compiler. Plus it's potentially universal - microcontrollers to desktops to servers to supercomputers. And there's a lot about it that means you'd seriously consider using it instead of, say, Java or C# (you get a lot of the benefits of a managed language without needing the runtime engine underneath). Rust has transformed Firefox too, so that's a major project demonstrating benefits.
I don't know how far compiler support has got, but I think as soon as Visual Studio picks it up that'll be hard to ignore.
...that WordPress was the most dreaded platform! Having seen for myself what a mess it makes under the hood, it doesn't surprise me...
I was amused to see Chef being dissed and Ansible way ahead. Having gone first Chef, then switched to Ansible in frustration, I find that’s at least one point SO nailed.
Don't get me wrong. I much appreciated Chef for turning me on to provisioning via code. And basically, I was able to port my hard-learned configuration easily because I had first learned under Chef. But it's way more effort than Ansible, which has a much clearer model both in its layout and execution. Ansible also maps way more closely to the bash commands that you would use if you did it manually. I've written almost no Python under Ansible, while I had to do lots of Ruby under Chef.
Lines of code are cut 50-60% under Ansible, but more importantly I have 2 control files in one directory instead of dozens in disseminated, highly nested directories with Chef.
Note: at large scales Chef is supposed to be quite a bit faster.
Seems Rust might be loved, and I can see why. If only I had the time I would get right into it but....
...according to Tiobe, it's the old classics plus Python that are the busiest on the web. Probably due to unis and colleges. Though they are starting to sidle away gently from the dreaded Java.
Python, at least in the UK, is taught in secondary schools and is becoming the de facto hobbyist language, for instance in the Raspberry Pi community.
Add to that Django and Flask. Add to that the mathematical NumPy and SciPy stuff that the unis are starting to adopt for sciences that are not computer science.
So I think there's a bit more to it.
The important point is that he is a proponent of using Jupyter notebooks (a kind of online live spreadsheet/scripted macro blend) to share the data and methodology in research papers.
(Don’t care much for ML myself. Very useful I am sure, just not to me.)
Agreed, I got 14 downvotes for pointing out some of the problems of Python; obviously there are a lot of ignorant people out there who have never seen anything else!
The best language for the Pi is of course C, but while people are de-skilled by being taught Python the Pi will just have to struggle by on a bloated 4GL.
I like Julia, but it seems to me NumPy and SciPy are big obstacles to replacing Python for a lot of numeric-intensive work. And, of course, the Flavor of the Day comes and goes but legacy code remains; there's still plenty of Fortran calling BLAS.
But that said, I don't do much work professionally in numeric or scientific computing, so I have only the most casual familiarity with what the cool kids are doing.
One of the nice things about Rust is that you can also create bindings for it for other languages. At one of our sprints someone did this for Python with the Rust JSON parser. This was just done as a proof of concept, of course, as there are already several very fast JSON parsers such as ujson.
Those pathetic "stand ups" that fundamentalist acolytes of Agile adore, who seem to think it's absolutely essential that the rest of the team know that Dwayne is still working on almost exactly the same code that he was working on yesterday and the day before and the day before that and will be working on tommorow too. And obviously Dwayne is thrilled to find out that Tom, Dick and Harry are also doing almost exactly the same thing as they were doing yesterday as well, and that it will help him no end in expediting and leveraging his vision for his code blah blah fucking blah.
A while back I worked in a team run by some child project manager who didn't have a fucking clue how to manage people, nor could he hack his way out of a wet paper bag, but was under the impression that if he wasted 30 mins of our time every morning with these absurd stand-ups it was job done for him. How the hell do these people ever get employed, and why is this moronic methodology so popular?
The thing is that the standup evolved from daily meetings in Japanese car factories (known as asa-ichi) and the key thing there wasn't "what are you working on", it was about quality problems and process improvement. So, let's say that it's taking more effort to get an exhaust to fit right, or they think of a way to do the windscreens better, that's where you discuss it.
In a development context, that might be something like you're noticing that the builds are failing because of lack of memory, so you discuss how to take that forward. Or, you find a Visual Studio plugin that saved you time, and maybe we should be using it. Meetings are ultimately about an interchange, and that's why standups where people say what they're working on are stupid. That's nearly all between you and the manager.
Korea might have been a better cultural example than Japan, but the issue is that many people will not actually turn around and say what they are struggling with until there is a meeting or some other prompt. One thing I have seen work well is breakout activities, from Lego or Meccano building to table football. This does two things: it frees the stuck person from the screen to think, and often there will be another person stuck too, and explaining to each other helps to put the thoughts together. Whiteboards are also very important; drawing things helps clarify. Acting things out also works, especially for C++ object stuff or systems with many processors. All of this means you need space to do it, people willing to try it, and equipment.
Bullshit. I've been a professional developer for over three decades, in a variety of organizations, from startups to big, well-established firms. I've developed device drivers and distributed application engines. I've written compilers and UIs. I've worked with everything from 8-bit embedded systems to mainframes and supercomputers. In working with customers I've seen a great many arcane legacy business applications. I have never seen a "large coding task" which couldn't be decomposed into reasonably-sized pieces.
Care to describe one?
Reasonably sized pieces is not a daily chunk for a sprint box to be ticked. And my length and breadth of experience is pretty much the same as yours so stop trying to impress. An example? The last place I worked required extremely complicated data calculation algorithms to be implemented. They took about a month to get right. If I'd had to waste my time in a stand-up I'd have said pretty much the same thing every day for that month.
Daily stand-ups are OK if they're kept short and waffling is discouraged. I suspect that one of their benefits is that having to report on what they're doing keeps developers' noses to the grindstone.
The most time-wasting of the agile ceremonies is the Sprint Retrospective - what went well (nothing especially), what went badly (same as last sprint), bits of paper commending team members (cringe), Post-Its invariably falling off the wall.
Dwayne is still working on almost exactly the same code that he was working on yesterday and the day before and the day before that and will be working on tommorow [sic] too
If that's what you're hearing in your stand-ups, You're Doing It Wrong. If developers are reporting the same status for several days, they're not agile. The team needs to be breaking that task down into pieces small enough to be reasonably managed.
That said, the agile teams I'm on never bothered with standing, as we all thought it was stupid. There are agile practices which are important for realizing the benefits of agile development, and others which are just ceremony. A team that can't keep daily status meetings - scrums or whatever nomenclature your organization uses - to a reasonable time without forcing everyone to stand is broken and not agile anyway.
The real problem, of course, is that agile methods aren't magic. There are some good ideas there which can be useful - if they're adapted to the particularities of the organization, team, and project, and if the team members make a good-faith effort to work with them, without either resisting for the sake of resistance, or adopting it as a religion.
I've been doing agile development for a decade and I've seen considerable benefits for the teams I'm on and interact with. Planning is much more accurate and useful; we still slip features, but we know what's going to slip much earlier, and we rarely slip delivery dates. Team members are far less siloed; we're much better at reassigning work among team members as capacity (and inclination) changes. People are much more comfortable about asking for help. Communication in general is greatly improved, which has had a positive effect on code and documentation quality.
But all of this very much depended on developing processes and conventions that worked for us. And continuing to let those develop as the teams, the people in them, and the work changes.
I had a laugh today. We were in a presentation and our Scrum evangelist showed a slide with how we scored, on a scale of 0-5, for various Scrum things. The first column was how we did on "Scrum ceremonies" and apparently we did very well on that, nearly a 5, but the other columns were 4s and 3s. We got a 2.5 for having predictable releases.
It reminded me of some kind of mind bending religious cult questioning why the UFO containing celestial beings hadn't arrived - after all they had performed all the prayers, incantations, sacrifices and ululations perfectly so what could be wrong? Perhaps the aliens were angry? It never occurs to them that they're out of their minds and the aliens aren't coming no matter how well they do their incantations.
That's scrum and most other development processes for software. Release code that the customer wants and keep all the irrational bullshit to a minimum. A process of some kind is necessary but frankly it doesn't matter what it is, providing that it isn't time-consuming and everyone knows what they're doing.
Because I would want to understand why everyone is working on exactly the same crap everyday and apparently making zero progress.
There are ways of making these standups productive and to be honest in many scenarios they are not needed and I would ditch them in those cases.
I worked in a team run by some child project manager
Got to deal with one as an interface - Sometimes the title 'project manager' should be 'project secretary' instead. No management, no vision, no decision, no psychology, and thinking that sticking to Gantt charts is the only way to deal with projects... :doh!:
There’s nothing wrong with meetings, as long as they actually serve a useful and productive purpose.
But, with modern comms, many (but not all) meetings could equally well, if not better, happen via email or text chats. This has the dual advantage of allowing comms to happen asynchronously in not-quite real time (let alone any obligation to get to the same physical place), and the minutes/outcomes also effectively mostly write themselves.
Unfortunately, many of the sort of manglers who see meetings as the only solution (rather than just one available tool in the toolbox) don’t grok this sort of comms technology (it could perhaps be argued that that sort of mangler doesn’t grok very much of anything, which is why they are a mangler: the Peter principle at work).
Quote: "...meetings could equally well, if not better, happen via email or text chats..."
True....but in my experience "face time" is the only way to identify some seriously corrosive aspects of personal performance -- notably incompetence and outright lying. Both of these can be hidden for ever behind emails! Oh....and I'm not just thinking about team members.....managers too!!
So....keep meetings short and to the point....but don't abandon them for email.
Current team is 4 people (1 is a scrum master/dev), we rattle through standup in 10 minutes. We all know what each other is working on and where we will stamp on toes etc. Retro in 45 minutes. It's all golden.
Last team however, team of 9. Standup was 30-45 minutes each day. Retro would take 2hrs+. We had one of those people who think they're getting marked by how long they talk for, and how many positions they take. Every proposal that anyone made came with a 5 minute monologue from the loquacious one about why he disagrees. Every code review of his would take days and days to resolve, because he'd take any comment as a challenge.
When the project finally finished, and we're re-assigning to different projects, they asked what I wanted to work on - "anything, but if the team has Mr Talker on it I'm looking for a new job".
Apple should have binned Obj-C 20 years ago and gone with C++, then they could have pulled in a far wider base of coders to write code for their systems and would have got the benefit of a modern language, not one that fossilised in the mid 90s. But no, instead they stuck with a language no one really wanted to learn, and to replace it they came up with yet another language no one outside the Apple ecosystem would give the time of day to - Swift. You really have to wonder what they're smoking inside that reality distortion field.
Apple should have binned Obj-C 20 years ago
If they'd tried to do that they would probably never have been able to release OS X… which was basically NeXTSTEP with some stuff bolted on. Objective-C has received lots of plaudits for not being C++ and avoiding many of the pitfalls as a result; indeed many people thought that Objective-C was the better approach. Swift may be the result of NIH, but both languages seem to be doing well enough for the developers working on macOS / iOS stuff. From languages like Dart, TypeScript, Kotlin and Swift we can see that there is almost always a need for a higher-level counterpart to the lower-level ones.
It's easy to avoid the pitfalls of C++ if you only have 1/10th the functionality. And from what I've seen of OS X code, most of it is written in C/C++ and people only use Obj-C when they absolutely have no choice - e.g. to access system or GUI functionality.
Apple "inherited" Objective-C from NeXTSTEP probably because of Steve Jobs and the lead engineers who worked on that also worked on OS X. So OS X got Obj-C and the Cocoa frameworks were similar too.
While it's an interesting language, it lost hard to C++ and doesn't really offer any tangible benefits that justify using it instead of C++. I think Apple has always had that not-invented-here attitude which makes them do their own thing. Replacing Objective-C with Swift being another case in point.
Python appears to be designed to stop any progress, obviously designed by some new age liberal fascist who thinks he knows best and imposes his dislike of increment, switch-case and basic array features to create a language that is almost entirely useless. PHP is far superior in that it is simple, C like and just works - far more useful for getting the job done.
C++ was always a memory accident waiting to happen, for all transactional programming all one needs is some simple C with a small garbage collected memory library that allows you to write code without ever having to manually track memory use. At the end of the transaction you simply hand back the buffers to be re-used on the next transaction.
Both jQuery and Python remind me of a university project to make work for people, in the real world they are simply noise.
" some simple C with a small garbage collected memory library"
Without modifying the language you can't have garbage collection in C. Sure, you can have automatic stack variables or use alloca() to allocate stack memory which will be reclaimed on function return, but neither is automatic garbage collection in the normal sense. At some point you'd have to manually call a library function to do it which defeats the point. You might as well just call free().
"in the real world they are simply noise."
No, that noise is people trying to tell you stuff you don't want to hear. I can't comment on jQuery, but Python is a powerful and very useful language. It has its issues but it does the job and is far better than the now (thankfully) fading Perl, which to all intents and purposes it replaced.
It is a conservative GC, so will not collect everything.
Because in C (and C++) you can't (mechanically) tell the difference between a pointer and an integer with a similar value - it is valid to pass a pointer around via numeric types (valid does not mean good practice).
Because in C (and C++) you can't (mechanically) tell the difference between a pointer and an integer with a similar value - it is valid to pass a pointer around via numeric types (valid does not mean good practice).
Only if the implementation defines intptr_t. A conforming C implementation may have no integer type which can hold a valid representation of a pointer. System/C for the AS/400 was such an implementation, for example.
A hosted C implementation, however, is required to support successful conversion of pointer values to and from strings, using the %p fprintf/fscanf specifier. The restored value is only a valid pointer if the source and destination pointer types are identical (modulo storage class), of an object pointer type (i.e. not function pointers), and not null, and the conversions both happen within the lifetime of the object. However, this is arguably a stronger example of why a C garbage collector has inherent limitations, since the Boehm collector should catch integer-type pointer aliasing (it scans memory for pointer values) but not string-type.
Note that it's impossible to catch string-type aliasing in all cases, because it's valid, for example, to write a pointer value to a file as a string, remove all instances of the value from memory, and then read the file and recover the pointer value. That value must still be convertible into a valid pointer if the conditions I listed above are observed. Thus even a conservative garbage collector can run afoul of a conforming (albeit perverse) program in a hosted C implementation.
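The round trip described above can be sketched directly. A minimal illustration (the function name is invented); a conservative collector that scans memory for pointer-shaped values would never recognise the string form:

```cpp
#include <cassert>
#include <cstdio>

// Round-trip a pointer through its textual (%p) representation, the way a
// perverse-but-conforming program might write it to a file and read it back.
void *pointer_round_trip(void *original) {
    char text[64];
    std::snprintf(text, sizeof text, "%p", original);  // pointer -> string
    void *recovered = nullptr;
    std::sscanf(text, "%p", &recovered);               // string -> pointer
    return recovered;
}
```

Within the lifetime of the pointed-to object, the recovered pointer compares equal to the original and is usable, even though no pointer-typed copy of the value need exist anywhere in memory in between.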
That's not a true garbage collector - it's just allocating global heap memory and managing it itself. It has no idea if a-n-other pointer is pointing at any of the memory it's allocated and will happily free that memory if asked to, leaving dangling pointers. That's not what most people consider a garbage collector.
It has no idea if a-n-other pointer is pointing at any of the memory it's allocated and will happily free that memory if asked to
This is incorrect. The Boehm collector - at least the last time I looked at it - will scan the process address space, looking for references to areas it knows about, precisely to detect aliasing. It's possible the design has changed (it's been quite a few years now), but that's how it was originally designed to work.
"will scan the process address space, looking for references to areas it knows about"
I don't want to use any library that is dicking about like that behind the scenes. And it can scan all it likes, but it will have no idea whether the value 0xABCDEF, which may be a valid address in its memory space, is actually a pointer into it or simply a coincidental value. Garbage collectors have to grok the higher-level program variables, not just low-level memory. There is no way around it: you CANNOT have a proper automatic garbage collector system in C.
You are all looking at garbage collection in the WRONG WAY, it's very simple to write a 100% effective GC system in C for transactional tasks (and some others).
All you do is write a simple block-based malloc and hold the blocks in a list relating to context, i.e. you have my_new_malloc(int size, void *context). Blocks can be say 1MB each or so, with a separate list for rare items that are too big. The malloc is very efficient because you never provide a my_free() function - so there's no looking for gaps - and it creates more blocks as required. When you run out of space you simply jump to a new block, allocated with the system malloc() or taken off a 'spares' list to prevent memory fragmentation (i.e. you only call the system malloc() once per block).
At the end of the transaction you have all the memory for that context in a simple linked list of standard sized blocks, there could be for instance 800,000 items in 75 x 1MB blocks. These blocks are now to be 'garbage collected' by simply moving all 75 blocks to your 'spares' list of blocks, hence freeing those 800,000 items by simply re-linking the block list of that context.
You can of course also store the context for any non-transactional items and GC them in the same way when required. My C-based GC is therefore faster than the system malloc(), it removes memory fragmentation issues and allows a far faster GC than even the system free() function, because the blocks are collected in one list operation rather than individually.
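What's described above is essentially a region/arena allocator rather than a tracing collector. A minimal sketch, with invented names (`Arena`, `alloc`, `collect`) standing in for my_new_malloc() and friends:

```cpp
#include <cassert>
#include <cstddef>
#include <cstdlib>
#include <vector>

// Bump-allocate out of fixed-size blocks; individual items are never freed.
// "Collecting" a context moves its whole block list onto a spares list, so
// everything allocated from that context dies in one list operation.
struct Arena {
    static constexpr std::size_t kBlockSize = 1 << 20;  // 1 MB blocks
    std::vector<char *> blocks;                          // blocks owned by this context
    std::size_t used = kBlockSize;                       // offset in current block (full => none yet)

    static std::vector<char *> spares;                   // recycled blocks, shared

    void *alloc(std::size_t size) {
        size = (size + 15) & ~std::size_t{15};           // keep allocations aligned
        assert(size <= kBlockSize);                      // oversize items would need a separate list
        if (used + size > kBlockSize) {                  // current block exhausted
            char *block;
            if (!spares.empty()) {                       // reuse before touching the system malloc
                block = spares.back();
                spares.pop_back();
            } else {
                block = static_cast<char *>(std::malloc(kBlockSize));
            }
            blocks.push_back(block);
            used = 0;
        }
        void *p = blocks.back() + used;
        used += size;
        return p;
    }

    // End of transaction: recycle every block of this context at once.
    void collect() {
        spares.insert(spares.end(), blocks.begin(), blocks.end());
        blocks.clear();
        used = kBlockSize;
    }
};
std::vector<char *> Arena::spares;
```

As in the scheme above, blocks are never returned to the OS, which is what makes collect() a constant-time-per-block list splice rather than per-item free() calls.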
Yet I have 14 downvotes and a list of comments telling me how it can't work or be efficient. I have news for you, it does work, is extremely fast and removed all the issues of having to track memory and so makes string manipulation (with optional, untracked my_new_malloc() calls as needed) simple and simplifies the entire program as very few things have to be explicitly tracked - so memory leaks are also pretty much eliminated too.
My C based GC is therefore:
* Faster than the system malloc()
* Much faster than the system free()
* Helps prevent memory leaks
* Can eliminate memory fragmentation
* Makes the coding simpler and more flexible as memory is automatically 'tracked'
Yes, very much worth 14 downvotes, thanks guys.
PHP is far superior in that it is simple, C like and just works
I think you disqualify yourself with this statement. PHP started as a hack and this is one of the reasons that it has one of the longest lists of CVEs of any language out there and comes with a loaded gun already pointed at the user's foot.
Python was never trying to be C-like, as it's modelled on Modula and is object-oriented through and through. Who needs case-switch when you can use dictionary dispatching? But I'm sure you can write shit code in any language.
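For anyone who hasn't met dictionary dispatching: in Python it's a dict of functions keyed by the value you'd otherwise switch on. The same idea in C++, as a rough sketch (names invented):

```cpp
#include <cassert>
#include <functional>
#include <string>
#include <unordered_map>

// Dictionary dispatch: map keys to handlers instead of a switch/case chain.
// Adding a new operation means adding a table entry, not editing a switch.
int dispatch(const std::string &op, int a, int b) {
    static const std::unordered_map<std::string, std::function<int(int, int)>> table = {
        {"add", [](int x, int y) { return x + y; }},
        {"sub", [](int x, int y) { return x - y; }},
        {"mul", [](int x, int y) { return x * y; }},
    };
    auto it = table.find(op);
    return it != table.end() ? it->second(a, b) : 0;  // 0 for unknown ops (a choice, not a rule)
}
```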
Actually the PHP syntax is extremely close to C; perhaps you haven't used them both enough to notice.
I've been coding in C since around 1986 and still code in it everyday, and have used PHP for both scripts and web back-ends for years so I think I know what they are both like.
PHP 'just works' for all cases because unlike Python the language works to help, not hinder. In a script I don't really care if something is "12.34" or 1.234e1, I just want it to work and PHP always delivers.
> Never seen the point of jQuery
"Those who do not learn from history are doomed to repeat it" (someone famous).
jQuery in its day was a really helpful way of not having to deal with the incompatibilities of browsers. These days that is much less of a problem (and the DOM and related APIs have been massively improved).
Today it is much less valuable, but too many other libraries (eg. Bootstrap) have taken dependencies.
Objects only clean up after themselves if you actually remember to destroy them so their destructor is invoked. Scoped and shared pointers help, but they're not some panacea.
And RAII is great and all but imagine this scenario - I write a C++ class which wraps a C library, e.g. libjpeg. The constructor loads the JPEG and the destructor frees the buffer up. Simple right?
Now imagine I use a local instance of this class to load the JPEG and push the variable onto a vector. Oh dear my code just crashed. Why? Because my local variable went out of scope and cleaned up the JPEG but the copy in the vector also points to the same data so it will crash when it is accessed or when the vector is destroyed.
So now I have to implement a copy constructor that says what to do if two objects exist pointing to the same data. Maybe I refcount the data and release it on the last reference, or maybe I make copies on a copy. Maybe I need copy-on-write behaviour in case one object modifies the data. Ah, but the copy assignment also has to cope if I assign a variable to itself, so it doesn't blow up. And C++ says if you override the destructor or the copy constructor then you should also override the copy assignment operator too. But wait! C++11 also says you should override the move constructor and move assignment operator too. So I potentially have to write five functions by hand to stop my simple wrapper from crashing.
But that's not all - now someone wants to inherit my class and my destructor wasn't virtual. So now my code has the potential to leak because the destructor wasn't called. And I might have lifetime issues if the libjpeg is shutdown while I still have instances of my class floating around.
So this simple RAII object can be a goddamned mess. Yes, I could write code to account for all these scenarios, or perhaps disable copying by inheriting from a noncopyable class, but in the real world these kinds of issues are the sort of thing that makes C++ a source of so many bugs.
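For what it's worth, one common escape from hand-writing all five special members is to hold the C resource in a smart pointer with a custom deleter, so the compiler-generated copy, move and destructor do the right thing (the "rule of zero"). A sketch, using an invented stand-in API (`image_load`/`image_free`) rather than real libjpeg calls:

```cpp
#include <cassert>
#include <cstddef>
#include <cstdlib>
#include <memory>

// Invented stand-in for a C resource API like libjpeg's (not real calls):
static unsigned char *image_load(std::size_t n) {
    return static_cast<unsigned char *>(std::calloc(n, 1));
}
static void image_free(unsigned char *p) { std::free(p); }

// The shared_ptr member supplies correct copy/move/destruction semantics,
// so the wrapper needs no hand-written special members: copies share the
// buffer via refcounting, and the last owner frees it exactly once.
class Image {
public:
    explicit Image(std::size_t n) : data_(image_load(n), &image_free), size_(n) {}
    std::size_t size() const { return size_; }
    long use_count() const { return data_.use_count(); }
private:
    std::shared_ptr<unsigned char> data_;
    std::size_t size_;
};
```

With this shape, the "push a local onto a vector" scenario above no longer double-frees: the vector's copy and the local share one refcounted buffer.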
All your problems stem from using something written in another, non-object-oriented language and having to wrap it into a class yourself, so you need to manage the memory and add operator support. libjpeg even calls exit() at times and you need to work around that, so you're hardly starting with a good base to build on.
But that's not all - now someone wants to inherit my class and my destructor wasn't virtual. So now my code has the potential to leak because the destructor wasn't called.
Public base class means you need a virtual destructor. IDEs will even warn you about it.
And I might have lifetime issues if the libjpeg is shutdown while I still have instances of my class floating around.
Closing the library down should go through your class.
Actually the majority of my problems originate from using C++ as the language that allow them to be problems in the first place and the mess of code I have to write to make those problems go away.
If I were to write a RAII wrapper in another language, Rust for example, I wouldn't remotely have as many issues as this. The struct would move on assign so no copy constructor issues. There is no inheritance so no virtual destructors and I could ensure the object couldn't live beyond the scope of the thing that owns it with lifetimes.
I don't think your problems are insurmountable; your first two complaints are addressed in C++11. As for your third complaint, I cheerfully admit I have no idea about Rust lifetimes.
You seem to want to be protected from using C++ in the wrong way, as if somehow the compiler knew what the right way was, but C++ and C don't work that way; if they did they'd be as bureaucratic as Java or would just remove options as Rust appears to.
Never seen the point of jQuery, a bloated framework that fills up the internet with questions
It's a good example of what happens when some self-important idiot who can't be bothered to read the specification for a language attempts to build a library on top of that specification, then throws hissy fits when his errors are pointed out.
C++ was always a memory accident waiting to happen, for all transactional programming all one needs is some simple C with a small garbage collected memory library that allows you to write code without ever having to manually track memory use
C++ is a language for people who know what they do, what they want and how to do it. Garbage collectors are a bypass for lazy and / or incompetent coders.
You'd hope then the taxes would be slightly lower if they don't include all the trimmings.
Anyone from the US want to give some ball park figures?
For comparison, the UK system, roughly, is: you'd pay a third in taxes earning up to £40k ($52k) and a chunk more above that, plus we also have VAT - the equivalent of your sales tax - at 20%.
That's very rough. It's no tax up to 12.5K. Then for the amount between 12.5K and 50K it's 20% (still zero on the first 12.5K).
Then on the earnings above 50K and below 150K it's 40%. So if you earn 55K, you only pay 40% on the 5K.
Plus NI and any pension contribution that is topped up by employer.
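For anyone unsure how the marginal bands compose, here is a rough sketch (2019-ish UK figures, ignoring NI, allowance tapering and so on; purely illustrative):

```cpp
#include <cassert>

// Marginal income-tax bands: each rate applies only to the slice of income
// that falls inside its band, not to the whole amount.
double uk_income_tax(double income) {
    struct Band { double upper; double rate; };
    const Band bands[] = {
        {12500.0, 0.00},   // personal allowance
        {50000.0, 0.20},   // basic rate
        {150000.0, 0.40},  // higher rate
        {1e18, 0.45},      // additional rate (open-ended)
    };
    double tax = 0.0, lower = 0.0;
    for (const Band &b : bands) {
        if (income <= lower) break;                               // no income left to tax
        double slice = (income < b.upper ? income : b.upper) - lower;
        tax += slice * b.rate;                                    // tax just this slice
        lower = b.upper;
    }
    return tax;
}
```

So on 55K: 0% on the first 12.5K, 20% on the next 37.5K, and 40% only on the final 5K.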
Remember, for the SO survey you're really talking about "person whose job happens to include working with language X", not necessarily "person whose job is primarily writing code in language X". Those highly-paid Scala developers may not write a lot of code, by whatever metric (function point or whatever) you care to use. They may be spending a lot of time refining algorithms or creating DNN architectures or who knows what.
Regarding the subsequent discussion on US salaries versus those elsewhere: I don't know that comparing numbers is helpful anyway. What matters more is parity purchasing power - does a roughly-equivalent job in the US give you the ability to buy more stuff than it does elsewhere? And even then, we'd want to somehow correct for things like what kind of stuff people want to spend money on - what's available, what cultural expectations are in play, and so forth.
So, for example, my wife and I own two houses. (With mortgages - we don't own them free and clear.) For most of humanity, now and historically, that makes us ridiculously wealthy. For that matter, in terms of annual income, we're in the 1% of both states where we live - though we're hilariously far away from the national 1% (which is concentrated on the coasts) or the 0.1% of either state. Would that be true if we were in the UK? Well, we almost certainly wouldn't own two houses in the UK (though we might in another European country).
But we'd probably still think of ourselves as being something like upper middle class. We'd probably still drive the same sorts of cars (except with manual transmissions, which would be a nice bonus), eat out about as often, take similar holidays, and so forth. Air and rail travel would be cheaper; driving would be more expensive. We wouldn't have massive student loans. We wouldn't be paying so much for medical insurance, but then we mostly don't notice that - it's done by payroll withdrawal. Would we feel richer or poorer, assuming we'd always lived there, than we do now? I don't know that we would.
One of the things that all the cool languages have to have is type inference. Don't bother declaring a type - the compiler can work it out. Once on this slippery slope they start to allow you to omit anything else that the compiler can work out.
This is all very well when you're writing code. It saves keystrokes and gets the job done quicker, although tapping keys is, in my experience, not on the critical path when coding. But it's a real pain when you're maintaining or debugging someone else's code, because you have to do the inferences yourself.
Modern IDEs can do the inference for you. If you use auto types in Java, C++, Rust, Go, whatever, the plugin will usually decorate the type somewhere.
Most languages with inference also require you to be explicit about the type in the function / struct signatures. They also allow you to be explicit in places where it makes sense to do it. And of course the compiler still cares for strong types, even if it can figure out what the type is in a lot of places.
Anyway, the purpose of inference is that it makes code terse, which can make it easier to maintain and less prone to error.
Up to a point. You're looking at a statement like:
val|var|let|const something = someObject.doStuff(foo, bar, baz) and you need to know what the something is. So you have to find out what someObject is, then look at its methods to see what this signature returns. If the type was declared, you could save time and, more importantly, spare yourself the distraction from the task in hand.
IDEs are sometimes helpful, but it rather depends on the language. IntelliJ with Typescript - not very helpful. With Groovy - WTF is all this undeclared stuff that seems to appear by magic?
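The trade-off in the last few comments can be shown in a short Rust sketch (the types and method here are made up for illustration):

```rust
// Hypothetical types, invented purely to illustrate the point.
struct Order {
    total_pence: u64,
}

struct Ledger;

impl Ledger {
    // The function signature is the one place the return type MUST be
    // spelled out, even in inference-heavy languages.
    fn settle(&self, order: &Order) -> u64 {
        order.total_pence
    }
}

fn main() {
    let ledger = Ledger;
    let order = Order { total_pence: 499 };

    // Inferred: to learn the type of `amount`, a reader maintaining this
    // code has to chase down Ledger::settle and read its signature.
    let amount = ledger.settle(&order);

    // Explicit: the same binding, but the type is visible at the call
    // site, at the cost of a few keystrokes.
    let amount_explicit: u64 = ledger.settle(&order);

    assert_eq!(amount, amount_explicit);
    println!("{}", amount); // 499
}
```

This is also why inference languages still demand explicit types on function and struct signatures: the signature acts as the anchor the compiler (and the human reader) infers everything else from.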
It's a really nice language with a modern package management system and a genuinely helpful compiler.
It's also a language that fixes a lot of errors that C/C++ compilers don't even care about - null / dangling pointers, double frees, buffer overflows, data races. All these are stopped by design in the language or by the compiler.
Even if you go back to C/C++ it'll have taught you things about writing safe code that will serve you well back there.
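One concrete example of the "stopped by design" claim: Rust has no null pointer. Absence of a value is an explicit Option, and the compiler forces the caller to handle the empty case, so the C-style "forgot to check for NULL" bug can't compile. A minimal sketch:

```rust
// Rust has no null: "maybe no result" is an explicit Option<i32>.
fn find_even(xs: &[i32]) -> Option<i32> {
    xs.iter().copied().find(|&x| x % 2 == 0)
}

fn main() {
    let xs = [1, 3, 4, 7];

    // The compiler will not let you use the result as a plain i32;
    // you must handle both Some and None.
    match find_even(&xs) {
        Some(n) => println!("first even: {}", n),
        None => println!("no even numbers"),
    }
}
```

The same idea - making the dangerous case impossible to ignore - is what the borrow checker does for dangling pointers and double frees.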
I went to their site and it's all market-speak about how great it is. So I went to the documentation and then through to "the book".
Nowhere did I get a simple page or paragraph explaining what Rust IS, and why I should want to learn it... if it's compiled, what platforms it supports, yada yada. I'm sure I could find this, but I'd put it on the website if it was me!
Rust is a compiled language. It runs as fast as C or C++ but without exposing you to many of the dangers of C or C++. Therefore your code has fewer bugs and is more reliable. It also has a modern package management system, is extremely portable, integrates well with C/C++ and is generally a more terse, simpler language.
On the flipside, the compiler is extremely strict and it takes some getting used to. It errors on things a C++ compiler couldn't care less about - e.g. two threads writing to the same shared data without protection - so there is an upfront burden of writing safe code before it will be turned into an executable.
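To make the shared-data example concrete: in Rust, two threads mutating the same value simply won't compile unless you wrap it in the synchronisation types that make the sharing explicit. A minimal sketch of the sanctioned pattern:

```rust
use std::sync::{Arc, Mutex};
use std::thread;

// Spawn `n_threads` threads, each bumping a shared counter `per_thread`
// times. Arc gives shared ownership across threads; Mutex serialises
// the writes. Drop either one and the compiler rejects the program.
fn parallel_count(n_threads: u32, per_thread: u32) -> u32 {
    let counter = Arc::new(Mutex::new(0u32));
    let handles: Vec<_> = (0..n_threads)
        .map(|_| {
            let counter = Arc::clone(&counter);
            thread::spawn(move || {
                for _ in 0..per_thread {
                    // Without the Mutex this would be a data race -
                    // a compile error in Rust, not a Heisenbug.
                    *counter.lock().unwrap() += 1;
                }
            })
        })
        .collect();
    for h in handles {
        h.join().unwrap();
    }
    let total = *counter.lock().unwrap();
    total
}

fn main() {
    println!("{}", parallel_count(4, 1000)); // 4000
}
```

That's the upfront burden in practice: you have to say "shared and synchronised" out loud before the code compiles, where a C++ compiler would happily accept the racy version.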
While I do like reading these types of surveys, if only to learn about languages/platforms I've never heard of, they do miss out on a couple of critical issues that make them not really reliable as a good source of metrics.
1. As has been mentioned by a number of Registerites, the type of folk who complete these surveys perhaps skews the results somewhat.
2. The results are not correlated to the task. If I am writing a hardware driver or some bare-metal code for an application-specific chip I will probably go for C/C++ (if a compiler exists for the CPU) or assembler. However, if I am part of a team building the next big money-spinning global solution, C/C++ or assembler would perhaps be quite low down on my list of language preferences. As there are many more high-level projects being built worldwide than, say, single-chip embedded devices, the higher-level languages will always score higher than the lower-level ones.
What would really be interesting is if a PhD student did a properly audited survey with proper controls etc. Now that would be an interesting survey to read...
Biting the hand that feeds IT © 1998–2019