Re: How do they get away with...
A PC isn't fast enough to render a CGI movie either - see 'render farm': a rack, or a room of racks, full of CPUs and GPUs.
>Also it would probably help to put the brains in the keyboard part rather than the screen
The 'brains' of an iPad weigh very little. The battery probably accounts for a fair fraction though.
The MS Surface Book approach was to place a second battery in the keyboard section for the weight balance reasons you outline.
One assumes they drop a load of iPads to empirically determine how much force they can withstand before breaking, then calculate how much polyurethane to wrap around an iPad to meet their specifications.
OtterBox started out making flight cases to house military kit before they used their reputation to flog phone cases.
> Real cost, privacy, security, availability
For an engineering company, the issues of cost, security and availability are *exactly* why they would want their CAD models to sit on their own cloud hosted on their premises, with engineers accessing them via a terminal, X Windows, a browser, whatever. This means their IP doesn't have to leave the premises, and isn't stuck on an individual workstation inaccessible to colleagues.
You're a lone wolf, I get it, but most engineering projects involve professionals from various disciplines working together.
>Being able to drag around the 3D model of sprocket (fine, three sprockets) in a browser window may be good enough for the 3D-printer-happy "maker" generation but is not something I would call CAD software
@Dropbear. I wasn't for a moment suggesting the CAD software run on ARM, merely that it doesn't matter if the *terminal* runs on ARM. The CAD model itself is running on a load of Xeons in the cloud (or on your local network); the terminal just needs to accept user input and display the workspace.
Back in the nineties we were using CAD software on a UNIX mainframe accessed through X-Windows from a terminal - before it was practical for standalone workstations.
Conceptually this was no different to accessing computer resources on the cloud. The geographical location of where the model is being processed with respect to the user has absolutely *nothing* to do with how many components, sprockets or otherwise, can be in the model.
CAD can be very resource intensive, especially the kinetic simulations you're attempting. That's *exactly* why Fusion 360 and others work with Amazon Web Services - to bring you even more RAM and processing power than you can fit under your desk. The alternative is to buy your own CPUs/GPUs, but then you have to use them enough to justify the investment.
Simulations, like renderings, are a good example of a task that doesn't require human input whilst the computer is crunching the numbers. Depending on which package you're using, you might find that it allows you to install clients on your other machines on your local network so that their CPUs/GPUs can be harnessed to speed up the calculations.
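Conceptually it's just farming independent jobs out to whatever machines you have. A toy round-robin scheduler gives the flavour (a hypothetical sketch in Python, not any particular CAD package's actual protocol):

```python
from itertools import cycle

def assign_frames(frames, workers):
    """Toy render-farm scheduler: deal independent jobs (frames,
    simulation time steps...) out to worker machines round-robin."""
    jobs = {w: [] for w in workers}
    for frame, worker in zip(frames, cycle(workers)):
        jobs[worker].append(frame)
    return jobs

# e.g. ten frames shared across three machines on the local network
print(assign_frames(range(1, 11), ["desktop", "laptop", "old-tower"]))
```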
> Speaking as an experienced design engineer, there's a lot of IO on big CAD models. You can take my machine with 3 SSDs, 26 cores and 128GB of memory from my cold, dead hands
The model is big, but it can be worked on remotely - the only data travelling between your terminal and the mainframe (or cloud) is user input and video output. Indeed, for bigger models than yours (think of visual effects studios) a render farm of clustered resources is used.
It's not for everybody, but in many circumstances a cloud CAD instance is suitable - plus you can rent more CPU/GPU power as you need it. Then of course there are the collaborative working methods common in many industries - colleagues need to work with changes you've made in real time, so having a model stuck on an individual workstation is not ideal (though there are ways of just uploading your changes rather than the entire model).
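That 'just upload your changes' trick is essentially rsync-style chunk hashing. A minimal sketch of the idea (hypothetical, not any vendor's actual sync protocol):

```python
import hashlib

CHUNK = 4096  # bytes per chunk

def chunk_hashes(data: bytes):
    """Hash the file in fixed-size chunks."""
    return [hashlib.sha256(data[i:i + CHUNK]).hexdigest()
            for i in range(0, len(data), CHUNK)]

def changed_chunks(old: bytes, new: bytes):
    """Indices of chunks that differ - only these need uploading."""
    old_h = chunk_hashes(old)
    return [i for i, h in enumerate(chunk_hashes(new))
            if i >= len(old_h) or h != old_h[i]]
```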
By contrast, dumping a day's shoot of 4K footage up to a cloud is impractical over most broadband connections.
It's been a surprisingly persistent myth that only hipsters use MacBooks - it might just be that they are the most visible users in coffee bars. Generally the baby boomer generation has more money, and it should be clear it's a demographic group Apple have always aimed at, regardless of their advertising.
For your camera club, the Windows ecosystem was a bit late to support a few important things for photography, though they're now mostly fixed. First, traditionally Macs had colour management, then Windows caught up. Then OSX had better support for high resolution monitors, then Windows caught up but Adobe dragged its feet (so that Photoshop would have ridiculously small UI elements on high-res displays). Macs had 16:10 displays whereas PCs fell almost universally into 16:9, until the Surface range helped prompt PC vendors into producing laptops with 3:2 screens.
In fairness to Microsoft they evidently now know what is required (the colour accuracy of their Surface Pro is highly praised, stylus support is good, resolution very high, aspect ratio is useful), but in the last decade there have been times when their software and hardware partners have been slow to wake up.
> Autodesk already have a full 3d cad running in the browser.
Exactly. So do OnShape, using a desktop or Android app as a client. This is important because if it works as advertised I'm no longer tied to any platform - a Linux laptop or Chromebook becomes a viable option. Even OSX doesn't have all the main CAD packages available for it, such as Solidworks (which package a company uses is often determined by its clients and suppliers).
It's worth noting that CAD (unlike video editing with its high IO demands) is suited to being run in the cloud - it's descended from the mainframe/terminal model, multiple engineers may be accessing the same model simultaneously, and models hold commercially sensitive information that's best not held on laptops floating about the place.
Whilst charging customers per month is appealing to CAD vendors, it does open the door to competitors who might charge per task, or per hour, which would work out cheaper for smaller companies for whom CAD is only a part of their workflow.
@Steve Todd
Quite right, there have been technical challenges, and Intel's difficulty in moving to a smaller process size has been well reported. My point largely stands though; being 'as fast as the fastest PC' [at some tasks] isn't that high a bar. The amount of money Intel throws at solving their technical issues is determined by the market though, and if gamers are advised that most games don't benefit from anything faster than a Core i5 (since the game engines have evolved with GPUs) then the demand for high end CPUs isn't as high as it might otherwise be.
The last Mac Pro was very much tuned for a workflow that wasn't coding - ie, video editing, and thus shunting a lot of data very quickly and frequently between onboard and redundant off-board storage. That use case is almost the last justification for a workstation, since other workflows such as CAD or animation can just have a GPU farm steered by a modest desktop.
Which is a long way of agreeing with you; it's hard to see what next year's Mac Pro will do for you over a Mac Mini. Tim Cook says it will be modular, but ultimately how much power do you need on your desk before building a render farm out of commodity hardware seems like a good idea?
I'm watching all of this with interest, including Chromebooks. My laptop was fast enough for the level of CAD work I was doing years back, provided I didn't mind waiting a few minutes for a render (which I only rarely needed to do). Therefore any new machine has not been a necessity, and I can watch these platforms - iPad Pro, Win 10 on Surface, Chromebooks - develop with disinterest.
Back in the day I had no choice over my platform - CAD software essentially dictated I used an Intel Windows machine.
Features that were available but exotic a few years back - good stylus support, high-res displays, external GPUs, alternative CPU architectures - are now mainstream or becoming so.
What my mobile workstation may be in a few years time I really don't know. It would be interesting if Photoshop or a similarly polished and featured alternative was available for ChromeOS.
Relax, we don't know the context of the "as fast as the fastest PC" quote - the exec might have been describing the speed of performing a particular quick task in Photoshop or iMovie or something. Desktop CPUs haven't been getting much faster at single-thread tasks in recent years anyway - because they've been fast enough, focus has been on improving energy efficiency and/or just adding more cores.
What Apple actually displayed as a bullet point was 'Faster than 92% of the notebook computers available last year', which Anandtech find very plausible:
Apple mentioned during the keynote that the new A12X is more powerful than 92% of the available laptops in the market, which isn’t very surprising given the performance levels we saw on the A12.
- https://www.anandtech.com/show/13529/apple-announces-new-ipad
> FAIL Two grand and it can't open two word document simultaneously? For a 'pro' tool that's pretty much fuck all use isn't it?
Not if the professional in question is using Photoshop with a stylus. In other news, Land Rovers make shit minibuses, Nissan Micras aren't very fast and Lamborghinis have rubbish fuel economy.
It's not even as cheap as some Samsung flagships from last year, which still beat OnePlus in most areas: SD card, waterproofing, Qi charging, better screen, better camera, ARCore support. Buy the new OnePlus if you want the latest Snapdragon and more RAM than you're likely to need, and can live without a fair few features.
> Not replacing a working phone every year or two also reduces waste...
Indeed. So buy a waterproof phone with specs good enough to see you through the next few years, stick it in a good case and put a tempered glass screen protector on it. Qi charging will also save your phone from landfill should your USB C port become damaged. I know USB C ports are said to be mechanically durable, but I know one bloke who got paint into his. I've also heard of a USB port being damaged by a bit of fluff and a 9V fast charger.
> When you make the entire surface of the phone interactive, it means that merely picking up the phone interacts with it.
You'd think so, wouldn't you? However, many phones have capacitive grip sensors in the side* and use software to distinguish unintended input. The advantage of a phone with slim bezels is that if you put it into a case (so that it becomes the same size as a phone with big bezels) you gain protection - cases can be deformed without damaging the phone.
*You can see this for yourself if you can access your phone's hardware diagnostics mode. On the Samsung S8 that is accessed by entering *#0*# into the dialler.
> It also ignores the functional importance of the bezel. It might look lovely to have a single piece of glass, a sort of infinity screen, but I'd rather mine had a little more protection, including from my own fingers.
How strong is your grip?! :) I'd take the smaller bezelled phone and stick a case on it - bringing the total package up to the size of a phone with thicker bezels. If the case gets damaged, it's cheap and easy to swap out for a new one. Plus, the case allows me to choose a level of protection to suit me, and other customisations that could never practically be offered across a range of phones.
Carry my phone in a plastic bag? It's just easier and safer to get a waterproof phone in the first place. Plenty of options from Samsung, Sony, Apple and others. And the plastic bag wouldn't help if the phone were dropped whilst it was actually being used or connected to wired headphones.
Waterproofing is good for anyone who goes hiking, camping, messing around in boats, lives in a potential flood zone (like Tewkesbury, NYC, Miami, Hiroshima..), lives in the tropics, has small children, uses their phone in the kitchen to view or time recipes, likes listening to podcasts in the bath, or is just occasionally butter-fingered. There's a lot of people who fall into at least one of those categories.
Making products durable reduces waste.
> this messes up the video player scaling. So just as broken.
Scaling, cropping or black bars are inevitable on any screen, given common video content ranges from 4:3 through to wider than 16:9. A good video player will give you options*. That said, the most common TV format, 16:9, is less wide than the 2:1 phone screens notches are usually cut from, therefore video at the correct aspect ratio and using the maximum vertical pixels won't encroach upon the notched area anyway.
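A quick back-of-the-envelope check, assuming a typical 2:1 panel of 2880x1440 (the exact resolution varies by phone):

```python
screen_w, screen_h = 2880, 1440   # 2:1 panel, held landscape
video_w = screen_h * 16 // 9      # 16:9 video scaled to full height
bar = (screen_w - video_w) // 2   # pillarbox on each side
print(video_w, bar)               # 2560 px wide, 160 px bars - a typical
                                  # notch sits well within those bars
```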
It's actually good that phones are no longer a slave to the TV aspect ratio, since for many people video playback isn't the main thing they use their phone for.
* An irritating exception is BBC iPlayer - be it through desktop browser or Android app - because it assumes that you're using a 16:9 screen and one can end up with both horizontal *and* vertical black bars. That's just daft.
(My laptop is 16:10 and my phone is 2:1)
> Where's me 3.5 M-M stereo lead?
Best buy a job lot of a dozen off tinternet - they always get lost. I daresay some are available in hi-vis orange, which would be handy (so many cables are black, whereas network cables and guitar/patch leads come in a range of colours - for a solid usability reason).
Actually it's a case of form follows function: why not use the area of the phone that is in line with the camera and earpiece to display notifications? Indeed, LG did this in 2016 by placing a small monochrome auxiliary display next to the camera, thus freeing up part of the main display for other content.
Pareto analysis suggests that most of the important information is imparted by a small fraction of the available icons - eg battery, cell signal, WiFi signal, clock, missed call, SMS.
Most of the objections to the notch concept are based on an individual's sense of aesthetics - I haven't read any objective reasons for disliking the notch. [OLED panels, even rectangular ones, are laser cut - thus adding a notch only lengthens the tool path a little and doesn't require a whole extra process that would increase cost]
> Even iVerge had to grudgingly admit that the iPhone XS is slightly worse than the 18 month old Pixel 2
You're talking rubbish. That Verge article was by Vlad, who places a lot of emphasis on cameras - and he's a big fan of the Pixel camera. He also reviews dedicated cameras for the Verge. In the past he's championed Android phones that were worthy of promotion, such as the Xperia Z3 Compact.
https://www.biometricupdate.com/201805/apple-receives-patents-for-ultrasonic-fingerprint-sensors-and-in-display-fingerprint-system - May 2018
Of course you folk don't need reminding that an Apple patent is no guarantee of Apple implementing said patent in a shipped product.
> Shame Apple never managed to get this [ under the screen fingerprint reader] to work, and of course, they'll now never introduce it
Apple have researched ultrasonic fingerprint scanners for use under a display - as have Samsung. This approach is faster and more secure than the camera approach that OnePlus are using, but is more expensive.
Apple may or may not use such a sensor in future - I guess it depends on what they learn from their users' Face ID experiences, as well as the engineering challenges of bringing an ultrasonic sensor in at an acceptable cost and size. Reports suggest Samsung may use an under-screen ultrasonic sensor on some models, whilst a cheaper version of the upcoming Galaxy S10 'Retro' (it doesn't have an 'Edge' curved screen) will have a side-mounted conventional fingerprint scanner, a la Sony.
Some people of all sexes evidently enjoy PVC costumes. My issue with The GIMP is that its abbreviated form doesn't contain the word paint, photo, picture or image - an admittedly small hurdle for a novice user, but small hurdles do add up for someone who is learning.
I don't know why some advocates of GNU/Linux are loath to make the system easier for would-be users.
I didn't say Xiaomi's mechanical watch was competing with Apple - it's just a time piece. It does look like an Apple watch though.
The interesting thing is what the article said - Xiaomi crowdfunded it, and backers have the added confidence that the parent company has a decade's experience of bringing products to market.
> I suspect that what people want is for it to be cheaper, easier and quicker to get someone else to do it for them. That is the modern way, sadly.
Every job is quicker the second time you do it. If someone who has done the job umpteen times can do it in a fraction of the time it takes me, then what's wrong with paying them to do it? It's merely specialisation and it's far from modern, since it dates back to the development of agriculture and permanent settlements.
I haven't played with Bitcoin, but I'm coming to see it as a version 0.9, a first draft - something that others will play with and identify room for improvement. Examples include:
A few years back there were cryptocurrencies that attempted to sidestep Bitcoin's fluctuations - fixing the value to an established currency, for example, or to a commodity. I understand Ethereum is moving from a proof-of-work to a proof-of-stake model, in an attempt to sidestep the excessive electricity consumption.
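To see why proof of work chews electricity, here's a toy version (a sketch only - real Bitcoin hashes a block header twice over and compares against a numeric target, but the principle is the same):

```python
import hashlib

def mine(block: bytes, difficulty: int) -> int:
    """Find a nonce whose hash starts with `difficulty` zero hex digits.
    Every failed guess is work done purely to be thrown away."""
    nonce = 0
    while True:
        h = hashlib.sha256(block + str(nonce).encode()).hexdigest()
        if h.startswith("0" * difficulty):
            return nonce
        nonce += 1

print(mine(b"example block", 4))  # each extra zero means ~16x the work
```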
Does the US Dollar value of one Bitcoin have much to do with Bitcoin's usefulness to a user? Or rather, was Bitcoin's previous high value not merely the result of speculation, speculation that was actually a barrier to its adoption as a practical means of paying for stuff?
In the physical world, a wallet is just as useful for holding 30,000 Dong as it is 10 quid.
> What did Windows, OS X, and Linux bring to the party apart from hardware improvements just to be able to run them in the first place?
As a '90s gamer, DOS was my main environment, and Windows was largely ignorable. RISC OS just wasn't available for my x86 machine. I'd used RISC OS at school just as I had used Atari ST and Amiga equivalents at friends' houses (and was jealous of their sprite-based games libraries).
OSX seems to have served Apple and their users well, especially those (mainly DTP and music professionals) who hung on in there during the '90s. I've not heard of RISC OS being considered by Apple before they brought in NeXT.
> And I'll tell you what else. How about a nigh-on unhackable ROM-based, compact and lightning-fast but still fully-featured, and Open Source, OS to power the IoT?
It would, but the trouble with it being ROM based is how to patch a vulnerability that is undiscovered at the time of shipping - as we've just seen with Amazon's FreeRTOS. To quote Douglas Adams: "the problem with something that is designed never to go wrong is that it is a lot harder to fix when it does go wrong than something that was designed to go wrong in the first place".
Of course you could take a small RTOS and subject it to much scrutiny, or take a mature RTOS on the grounds that any vulnerabilities would have been found by now. There's also active research into OS kernels that are mathematically proven to do what they're supposed to - formal verification.
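For a flavour of what formal verification means in practice, here's a trivial, hypothetical example in Lean - define a function and have the proof assistant machine-check a property of it. The same discipline, scaled up enormously, is what projects like seL4 apply to a whole kernel:

```lean
-- A made-up function for illustration: clamp a value into a byte's range.
def clampToByte (n : Nat) : Nat := min n 255

-- Checked by the proof assistant for every possible n,
-- not merely tested on a few values.
theorem clampToByte_le (n : Nat) : clampToByte n ≤ 255 :=
  Nat.min_le_right n 255
```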