I would do anything for an upvote...
...but I won't do that.
1513 posts • joined 10 Jun 2009
" 2) "VR" Games. They suck. I don't want an immersive VR experience, where I have to mime getting out of the car and twisting off the gas cap in order to refuel. I want to press "Y" while next to the pump and have my tank meter zip to "F" on its own. In fact, I don't want to even stand up.
3) No/crap support for non-VR games. The main reason I want 3d is for stereo 3d, not for a VR experience. I like traditional styled games, where you sit and use the keyboard or maybe (for the less cerebral games) a console style controller. "
Lots of development houses thought that way, and started trying to do traditional gaming in VR (the original Oculus dev kits didn't come with motion controllers, after all) but in the end, they all came to the same conclusion -- that the sense of presence was too much, and using a controller just felt weird.
VR security is a non-issue. Not because it isn't a potential problem, but because either your headset is just a fancy display controlled by an external computer/console or it's a display with an Android device built in. That means VR security is IT security, and business as usual, not a new category. VR attacks will exploit the exact same attack vectors as every other attack, but there will be far fewer of them, as it's a small target group. Furthermore, with the exception of the Quest and similar standalone units, it's going to be pretty much impossible to identify potential targets from internet metadata, as there's nothing to set a VR-equipped PC or Playstation apart from non-VR ones unless you happen to be browsing the web from inside a headset.
" both financing its operation and financing its abolition. "
Well, there is the question of morality vs pragmatism. If the world's run by slaves, conscientious objection costs money that might render you uncompetitive. Then eventually you get the chance to do something about it.
Or maybe it's just that families are made up of different people with different views.
If you look at the PEP, one of Guido's justifications was that reviewing existing Python code, he was finding plenty of examples of people duplicating work (eg [ f(x) for x in x_list if f(x)>0 ]) or doing redundant work to avoid nested ifs, and he saw this as a solution for real-world problems.
The other advantage is removing what the linguist part of me would call "long-range dependencies". The closer the assignment is to its use, the easier it is to reason through the code. Particularly, mathematicians and scientists are used to reasoning through things in semantically dense formulas, and less used to the step-by-step imperative style.
When you remove the possibility of =/== substitution bugs, the main danger of C-style assignment is gone.
The main reason I see the walrus as a good thing is in list comprehensions.
newlist = [ f(x) for x in oldlist if f(x) > 0 ]
Now imagine that f(x) is O(2^n) or O(n!) -- you've either got to take the performance hit of running it twice or expand the code out. For a scientist or mathematician (and academia is a major target audience for Python), this one-line, pseudo-mathematical, pseudo-functional approach is much more readable.
My personal preference would have been for local variable declarations, eg:
newlist = [ result for x in oldlist if result > 0 where result = f(x) ]
but that's not the way they chose to go. Instead we have:
newlist = [ result for x in oldlist if (result := f(x)) > 0 ]
...which is less restrictive than my way, and Guido set out good reasons for it. But when you look back at the examples, I think it boils down to this: a great many users of Python aren't immersed in the imperative programming style the way most pro software devs are, and the reliance on branching blocks and multiple lines is more confusing to them because it leaves the code's logic implicit. I'm surprised it took Guido as long as it did (v2.5) to introduce conditional expressions, actually, as they're really useful for bringing a single mathematical function into a single line.
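To make the double-evaluation point concrete, here's a minimal sketch -- the trivial f and the call counter are just stand-ins for an expensive function and instrumentation, not anything from the PEP itself:

```python
# Hypothetical cheap stand-in for an expensive f; the counter is just
# instrumentation to show how many times f actually gets called.
call_count = 0

def f(x):
    global call_count
    call_count += 1
    return x * x - 10

oldlist = [1, 2, 3, 4, 5]

# Pre-3.8 style: f(x) is evaluated in the filter AND again in the expression.
call_count = 0
newlist_old = [f(x) for x in oldlist if f(x) > 0]
calls_without_walrus = call_count   # 5 filter calls + 2 for the passing items

# Walrus style (Python 3.8+): f(x) is evaluated exactly once per element.
call_count = 0
newlist_new = [result for x in oldlist if (result := f(x)) > 0]
calls_with_walrus = call_count      # 5 calls total

assert newlist_old == newlist_new
assert calls_with_walrus < calls_without_walrus
```

With an f that's genuinely expensive, that difference in call counts is the whole argument.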
On the other hand, it is now getting more and more widely recognised that the YouTube algorithm is a slave-driver, and if you don't work yourself beyond the point of being able to produce decent content, you will be punished by being made irrelevant in the listings.
Whole multi-person production teams spend months to create a couple of hours of television, and YouTube is pushing solo producers to knock out around an hour of video per week to get a basic income. Of course corners get cut.
I'm not defending Raval here, though -- he chose how to react to the pressure. If he'd exited YouTube with his reputation at a high point, he could probably have got himself a fairly decent job. Instead he tried to continue in an unsustainable market.
" It seems such a counter-intuitive thing to do and yet is so widespread. The only explanation I can think of is that users learn it from each other. "
The most underestimated force in UI design is the path of least resistance. Deleting is one of our quickest actions (thanks to having a dedicated key) and if deletion is (initially) non-destructive, the apparent continued availability makes it seem like the most efficient means of archiving.
If we also had an "archive" key on our keyboards, more people would archive properly.
All very much true.
However, the good course designer starts with a particular demographic in mind as his/her target audience, and builds the course around them. A good teacher will adapt the course on the fly if students are finding it too hard or too easy.
Modern digital courseware wants to sell to as wide an audience as possible, which means all notions of prerequisite learning go out the window. Then there's the tendency for everything to be live coding instead of lectures, which means everything's paced by lines of code rather than complexity of concepts.
I'd be more generous and say that he's fallen into the greatest cognitive trap of the crowdfunding era -- the idea that money comes first, then everything else falls into place.
Kickstarter was so swamped with this sort of thinking (for example, all the non-engineers who said "give me some money and I'll hire an engineer to create my technologically impossible and/or financially infeasible games console") that they insisted on prototypes first (and now they all head to Indiegogo instead).
The notion of building a course around freely-available information is so much the mainstream that the notion on its own is valueless (how many of your uni lecturers invented and/or discovered the stuff they taught you?) -- it's the execution that matters.
What I note as missing from the article is any discussion of quality assurance. If you're delivering a course to that many people, there should be several layers of oversight, and ideally also a fairly rigorous testing process involving teaching beginners... I doubt there was any of that.
It's the thing that bugged me most about Coursera and Udacity when they first came out -- 1000s of people taking courses that had never been run before, and many of which (on Coursera and EdX, anyway) were only ever run once. No beta testing, no refinement or improvement... all people ever saw was the first draft. In that situation, at least these courses were based heavily on tried and tested university modules, but even then, the change of medium really called for a lot more in terms of adaptation and testing.
That's an odd comparison on several levels.
First of all, the 80s were full of one-man dev teams, like Matthew Smith (the Miner Willy games), Andrew Braybrook (although he was part of a software house, most of his genius games were solo coded) or Jeff Minter. With 16, 48 or 64K to play with, it was pretty easy for one person to fill all the memory in a few weeks' work.
But Mike Singleton... well, he did some clever maths-based programming to make games that most people wouldn't have thought possible -- even now part of me wants to believe that Lords of Midnight on a 48k spectrum is just a Mandela effect.
We're now living in an era where technological marvels are beyond the reach of the sole coder, and what indies like Pope are doing is finding ways to use what's already there efficiently to build compelling story-driven games without having to worry about the tech.
OK, so Pope did have to write a custom shader to get the "dithering" effect for monochrome shading, so it was far from a totally non-technical project, but the comparison just seems weird to me.
" Unlike the wizardry used by the HoloLens 2 to calculate the position of the user's extremities, the Quest will employ its monochrome cameras and "AI magic".
Good luck with that – the Quest is hardly the processing powerhouse. "
Processing not required, allegedly. Oculus seem to be saying (https://www.youtube.com/watch?v=K2zLneGGbk8) that this is all done on the device's DSP hardware, and that the DSP hardware was specced up deliberately to leave enough capacity after implementing the basic inside-out tracking for later hand-tracking to be included.
I haven't looked into developer specs in any depth (if I play with VR dev, it's always with the dev kits in Unity or UE) but I'm not aware of any API to allow developers to access the DSP for custom functionality, so I don't believe it's going to have any real impact on device performance. Yes, there's reportedly currently noticeable latency in it, but that's likely to be down to the DSP itself.
"Algorithms have to crunch through tons of data in order to make accurate predictions. Relying on a single individual's data is probably not enough to get it to work. Even if a tool is effective for groups of people, it doesn't mean it'll necessarily be accurate for a single person."
It's even worse than that, though. Setting negative expectations regarding an individual is likely to increase that person's alienation, making the negative prediction a self-fulfilling prophecy. A famous education experiment (Rosenthal and Jacobson's "Pygmalion in the Classroom" study) told teachers that a set of kids were going to be "late bloomers" and highly intelligent. The teachers treated them differently, and the prediction came true. There was nothing particularly special about these kids, other than the teachers' expectations.
If convicted criminals are treated with constant suspicion by beat cops, they'll be convinced that rehabilitation isn't possible and bam! -- self-fulfilling recidivism prophecy.
I haven't really been able to examine Eiffel closely, because as soon as I look at it, I see yet another impenetrable jumble of fixed-width letters.
Why are we so obsessed with fixed width? It is hard to read, making it easy to make mistakes.
Why are we so obsessed with plaintext? It's so inconvenient that we spend our lives hacking IDEs to try to present layers of visual meaning on top of the text through colouring and font weight. And still it lets us type illegal code -- syntax errors.
Creating a truly smart language integrated with the IDE would be simpler than hacking the editor to highlight errors.
Just look at the Stride language, designed as a teaching language. Almost all the flexibility of Java with practically no scope for syntax errors.
" I take it that you hadn't heard that this process was completed in 1760 under George III? The Monarch's assets were separated off and paid to the Treasury, "
Which is why anything since taken from the people would be accurately described as stolen in a legal sense. Under previous monarchs, many state assets have gone mysteriously missing from high-security locations and no charges have been brought, including several generations of crown jewels. The crown jewels were quite certainly stolen. Who by, we can't say, but the lack of any action suggests someone who is above the law. Some of these disappearances are said to line up quite coincidentally with cashflow problems in the households of the then-reigning (now dead) monarchs.
" and in return the government ran a Civil List returning a set figure "
Ah, so we got back what they stole from us... by buying it? You think having to buy back what was stolen is justice. If so, I'll sell you your own car next Sunday.
Oh, and then you mention the Crown Estate. How did they get the Crown Estate? Now I'm not saying they stole it, but I think the previous poster would. Why did they have rights to own the land in the first place?
If you're honest with yourself and think way back, your early mouse experience was probably like mine -- pushing the mouse to the edge of the desk and off the side because the pointer wasn't quite at the edge of the screen yet... then embarrassedly realising that I could just pick it up and move it back without the pointer moving.
We all had problems getting our heads round the mouse when we started.
For me the issue was rarely the ball itself, but the accumulated crud that had transferred from the ball to the rollers.
One of the PC magazines gave away a free cleaner that I thought was great -- it was a ridged ball on a stick and you just took the mouse ball out then wiggled this in the hole to scrape out the rubbish.
Then I realised it was quicker and easier to use the pocket clip on the lid of a cheap bic biro and just scratch it all off directly.
" There is absolutely no way to ensure that f(x) is idempotent. If you don't understand that, then step away from the keyboard. "
And this is another reason to favour the PEP -- taking the same example
results = [(x, f(x), x/f(x)) for x in input_data if f(x) > 0]
if the function f is not idempotent, then we now have the possibility of throwing a divide-by-zero exception, which would cause the whole comprehension to be binned. (E.g. first call to f(x) returns 1, but in x/f(x), f(x) returns zero.)
In the case of the assignment expression version...
results = [(x, y, x/y) for x in input_data if (y := f(x)) > 0]
... as f(x) is only evaluated once, x/y will never result in a divide-by-zero exception.
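To make the failure mode concrete, here's a sketch using a deliberately non-idempotent f -- a hypothetical stand-in that just returns preset values in sequence, so successive calls disagree:

```python
def make_f(values):
    # Hypothetical non-idempotent function: ignores its argument and returns
    # the next value from a preset sequence on each call.
    it = iter(values)
    def f(x):
        return next(it)
    return f

input_data = [10]

# Double-evaluation version: three separate calls to f per element.
# Call 1 (the filter) returns 1, so the element passes; call 3 (the
# divisor in x/f(x)) returns 0 -> ZeroDivisionError kills the whole thing.
f = make_f([1, 5, 0])
try:
    bad = [(x, f(x), x / f(x)) for x in input_data if f(x) > 0]
    crashed = False
except ZeroDivisionError:
    crashed = True

# Walrus version: f is called exactly once per element, so the value that
# passed the "> 0" filter is the very same value used as the divisor.
f = make_f([1, 5, 0])
good = [(x, y, x / y) for x in input_data if (y := f(x)) > 0]
```

Same data, same f, but only the double-evaluation version can blow up.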
Anyone arguing whitespace vs block delimiters is really missing the point -- these are both compromise solutions for coding in plaintext on memory-limited machines.
Modern computers handle block nesting in much more sophisticated ways -- think XML.
Take a look at Stride, an educational programming language based on Java.
The editor is "frame-based", meaning all the block delimitation is implicit, and it's damn near impossible for a coder to mix up the flow control.
It's also impossible to commit any syntax errors, as starting a line with if, while etc leads to a template being presented with all possible slots presented as text boxes, so you can only put something where it's permissible.
Better still, you no longer have to type full commands, with a single keystroke indicating the line's key function: "=" for assignment, "i" for if, "w" for while etc.
It's quicker to code than standard Java, it's less bug-prone than standard Java, it's made up almost entirely of keyboard shortcuts, and because it saves as XML, you technically could edit it in plaintext if you wanted. It seems to be everything programmers want... so why aren't programmers picking up on it?
I agree that visual programming is very limited.
I think the future of code is in frame-based programming. Check out the Stride programming language used in the Greenfoot and BlueJ educational IDEs. It's designed to let you develop traditional line-paradigm code more efficiently by reducing the number of keystrokes and making syntax errors impossible and scope errors rare.
Every type of code statement has a limited range of possible syntaxes, so Stride turns each statement type into a template where you fill in the boxes. As an educational programming language, Stride maps to a subset of Java, and any Java code can be called from Stride.
It also renders the block delimiters vs meaningful whitespace debate moot, as blocks, scopes and indentation are handled by the editor automatically as the programmer is no longer dealing with plaintext.
I think this frame-based paradigm has real potential to change coding practices in a way visual coding never really did.
" News sites make money off of the additional traffic driven to them by the news aggregators. "
Which assumes that:
A) aggregator sites drive traffic to content providers
B) the traffic driven is of high value.
Whether A is true or not depends on whether you're interested in unique visitors or page-views. Aggregators increase the former, but decrease the latter. This is where B comes in. "Drive-by" readers are less valuable than brand-loyal "sticky" readers.
Overall, aggregators appear to cost content providers significantly.
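As a back-of-the-envelope illustration of how more unique visitors can still mean less revenue -- every number here is invented purely for the sake of argument, not real traffic data:

```python
# All figures are invented for illustration only.
pages_per_visit_loyal = 8      # a "sticky" reader browses around the site
pages_per_visit_driveby = 1.2  # an aggregator referral reads one story and leaves
revenue_per_pageview = 0.005   # assumed flat ad rate, in dollars

loyal_visits = 10_000
driveby_visits = 25_000        # suppose the aggregator brings lots of these...
displaced_loyal_visits = 4_000 # ...while displacing some loyal habits

revenue_without = loyal_visits * pages_per_visit_loyal * revenue_per_pageview
revenue_with = ((loyal_visits - displaced_loyal_visits) * pages_per_visit_loyal
                + driveby_visits * pages_per_visit_driveby) * revenue_per_pageview

uniques_without = loyal_visits
uniques_with = loyal_visits - displaced_loyal_visits + driveby_visits

# Unique visitors go up (10,000 -> 31,000) while revenue goes down.
```

Under these made-up numbers, point A holds (more unique visitors) while point B fails (the extra traffic is worth less than what it displaces).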
" Layer your software - assembler/C at the metal/kernel. C++ at the system level. Dynamic scripting language a the application level. "
Where do libraries sit though?
What do you see as the role for functional programming?
And why would you want dynamic scripting for applications when self-modifying applications are a security risk?
The biggest issue with nuclear is that all current practical economic models emphasise short-term outcomes, and the last thing a nuclear reactor needs is a management team that can't see the bigger picture. This really comes into play when the reactor hits the end of its life, as decommissioning hasn't historically always been budgeted for, meaning the operator goes bust and leaves the cleanup to the public purse.
I reckon nuclear operators should be obliged to buy government bonds to insure the cleanup, and if they can do it cheaper and cash in the bonds, good for them.
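The bond idea amounts to pre-funding the cleanup at today's prices. As a rough sketch (the cost, lifetime and yield figures are all invented for illustration):

```python
# Invented figures for illustration only.
decommissioning_estimate = 1_000_000_000  # dollars needed at end of life
plant_lifetime_years = 40
bond_yield = 0.03                          # assumed annual government bond yield

# Up-front bond purchase that compounds to the estimate over the plant's life.
upfront = decommissioning_estimate / (1 + bond_yield) ** plant_lifetime_years

# If the operator later decommissions for less than the estimate, the
# surplus bonds can be cashed in -- the incentive mentioned above.
```

The point of the mechanism is that the money exists regardless of whether the operator is still solvent when the bill comes due.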
" It's not an etymological fallacy if it's also used (and understood) in the original sense. It's only the case if the original meaning is almost never used. "
Yes it is, because the previous poster was talking about the etymological argument being used against the modern one. That's fallacious -- just because one version matches the etymology, doesn't mean the other version is wrong.
" The university that I work for in 1996 dropped Computer Science in favour of Applied Computing, realising that industry doesn't need someone who can build a linked list library from scratch but rather knows how and when to use an existing library. "
But this brings up the problem of how to develop the mental schemata to process what you're doing. If all you ever do is work with libraries, everything below that level of abstraction stays a black box, and you never fully understand what you're doing.
The other side of the coin is that if you only ever deal with fundamentals like manually programming lists, you're missing out on several levels of abstraction and can only do real-world tasks within a very narrow domain.
The problem we have is that most courses fall into one of two extreme camps, and few people are discussing the middle-ground. But if you look at things like Stride in BlueJ, we're slowly starting to approach it.
Hmmm... while I like the concept, I'm a little bit unsure about how the data is encoded. My understanding is that classic Mario games were all designed as repeated "chunks" of tiles, hence patterns of blocks that recur throughout the game.
This is non-trivial -- part of the learning curve of Mario is the fact that you become more fluent/competent in these patterns as you play, and then adapt your strategy based on the different obstacles before and after.
A machine learning algorithm based on the individual tiles may spot the repeating patterns and implement them incidentally, but not necessarily... in which case the levels would not be "Mario" levels.
If you're looking at generating levels at the tile level, pretty much any platformer of a similar vintage would be more appropriate, as most other games had fully hand-crafted levels. But then again, 2D Mario games were always huge because of how quickly the levels could be generated, and other platformers don't provide the same size of training set as Mario.
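To pin down what "repeating chunks" means at the tile level, here's a toy sketch that counts every fixed-size window of tiles in a level grid -- the grid, tile characters and window size are all invented for illustration, nothing from an actual Mario data set:

```python
from collections import Counter

def chunk_counts(level, w=3, h=3):
    """Count every w x h window of tiles in a level grid.

    `level` is a list of equal-length strings, one character per tile --
    a toy stand-in for real level data.
    """
    rows, cols = len(level), len(level[0])
    counts = Counter()
    for r in range(rows - h + 1):
        for c in range(cols - w + 1):
            window = tuple(level[r + i][c:c + w] for i in range(h))
            counts[window] += 1
    return counts

# Toy level: '.' = air, '#' = ground, '?' = question block.
level = [
    "....?......?....",
    "................",
    "################",
]
counts = chunk_counts(level)
most_common, n = counts.most_common(1)[0]
# The heavily repeated windows are the "chunks"; the open question is
# whether a tile-by-tile model learns to reproduce them as units.
```

A model trained on raw tiles sees only the statistics of these windows; whether it reassembles them into the designer-intended chunks is exactly the doubt raised above.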
" "The problem UWP faced was the fact that it was something new"
I disagree. The main problem with it was that it was an entirely inappropriate interface for people who weren't using a handheld device."
If the existing Windows API had been genuinely abstract, UWP could have been one API with two or more presentation methods, and the problem is gone.
Windows has always failed to decouple behaviour from presentation.
" "if there's anyone who can make it work, it's Apple."
The Apple that possessed that sort of magic stopped existing a few years back. "
Not necessarily. The ace up Apple's sleeve has been their mostly successful attempts to retain control of look-and-feel in iOS app UIs. The problem UWP faced was the fact that it was something new, and Windows already has layer upon layer of UI cruft -- how many different load/save dialogues are there in the Windows 10 interface for legacy support, for example? Windows never abstracted the APIs enough away from presentation that they could be considered interchangeable.
iOS, on the other hand, was built ground-up on the philosophy "when we change, you change" and "if you don't do it our way, your apps will break". Apple refused to repeat the legacy trap that haunts Windows (and MacOS, to a certain extent), which means they now have an established platform and ecosystem with a high degree of future-proofing. iOS apps that don't rely on specific hardware configurations are already a cat's whisker from being MacOS apps, and it wouldn't be all that hard to rewrite the APIs for non-touch.
" They seem to like adding them near schools. "
That's quite logical -- roundabouts force drivers to slow down, whereas you can fly over a crossroads at full pelt when the light's on green.
Unless you're suggesting that the only solution to a bad man with a speeding car is a good man with a speeding car, and that all schools should have a NASCAR-trained marshal to nudge speeding drivers out of the way of kids, I think roundabouts near schools are eminently sensible.
Betty Boop and other cartoons of the era were targeted at adults. A lot of it was about bending adult expectations of reality, and was at best indecipherable to kids, at worst nightmare-inducing (and I think the Pink Elephants On Parade sequence in Dumbo was inappropriate for kids, harking back to the days when cartoons were a pre-feature item at adult cinema).
" So behave yourself. You are under observation. "
Oh dammit! What we didn't want was everyone acting as though they're being watched. Have you never heard of the observer's paradox?
Bugger it all -- the experiment's a bust. Might as well put the whole planet in the autoclave and restart from scratch.
I'm never going to finish my BSc at this rate....
" I've still got it somewhere, along with an Opus Discovery disc drive that allowed the ZX Spectrum to use 3.5" floppy discs as if they were Microdrive cartridges if memory serves me right (sometimes it doesn't, I think I'm getting old). "
I think all of us who remember Spectrums are getting old....
Biting the hand that feeds IT © 1998–2020