Re: Pointless
> Yeah, but imagine going out for the day with only 40% charge
Read Mike Lewis' comments again. He was saying that batteries being *stored* unused for a while should be left at around 40% charge.
And across how many devices did you test these Android version updates?
Also, you note that the OEMs weren't involved in the Android versions... so where did the ODM binary driver blobs come from?
Is it possible that you've done something special to experience flaws that most others haven't?
YMMV, evidently.
Sony Xperia phones used to let you specify a maximum percentage to charge to, say 85%, so the phone would stop charging when it got there. Apple phones, I believe, if plugged in overnight charge to 90% then pause, and then top up just before you wake.
It's features such as these that should be noted and appreciated.
(Missing this feature on my Galaxy I looked at XDAforums and it appears it can't be implemented without rooting. I wonder if the feature can be approximated by the USB charger... Even a simple timer would do the trick)
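As a back-of-an-envelope check on the timer idea, here's a quick sketch in Python. All the figures are illustrative guesses, not measurements for any particular handset:

```python
# Rough sketch: how long to run a plug-in timer to approximate an 85% charge cap.
# All numbers below are hypothetical, not measured values for any real phone.
capacity_mah = 4000      # assumed battery capacity
start_pct = 20           # charge level when plugged in
target_pct = 85          # desired cap
charge_ma = 1500         # assumed average charging current

# Charge needed to get from start to target, then time at the assumed current
needed_mah = capacity_mah * (target_pct - start_pct) / 100
hours = needed_mah / charge_ma
print(f"Set the timer for about {hours:.1f} hours")  # about 1.7 hours here
```

In practice charging current tapers off as the battery fills, so a real timer setting would need a margin, but as a first approximation it would do the trick.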
Phone batteries can be replaced, just not always by the user. In the case of Apple and Samsung authorised repairers the cost is roughly 8% of the cost of the handset. Less if you go back street. You can normally save more than that when buying a handset by waiting a couple of months for a special offer.
> Seems a better match for folk who want to flaunt their ability to have a usable instance of Excel that fits in a trouser pocket.
Hahaha! That would only impress a special kind of woman. When you find her, hold on to her!
Why am I thinking of Gary Larson's 'Punk Accountants' right now?
It's not a folding phone; it's an unfolding phone, and a folding tablet.
Not just semantics: it reminds us that using it at its best requires apps tuned to a tablet. And by any metric, the ecosystem of apps for Android tablets is lacking - the iPad has the lion's share of developer interest and revenue. Heck, even Google is more interested in ChromeOS for tablets than it is in Android.
Apple are testing the technology, but they don't have to rush a foldable iPhone to market. *If* they do, however, then it will also be a foldable iPad Mini with stylus support and no shortage of good apps for productivity and creation. As the cost comes down, foldable Android devices will fill much the same role as Android tablets - which is very often the role of a cheap video player for children.
It's only foolish to spend thousands on a phone or other toy if that money would otherwise improve your lifestyle. If you've already got several houses, cars, educated kids, good shoes, whatever, then you won't miss the cash. The bigger fools have caned their money on coke.
Is that state of affairs right? Well, that's not really a question a phone vendor can be expected to answer.
A few Reg readers have said they've bought the dual-screened Microsoft phone and are pleased with it. I know it doesn't fold, but 1, it gives the same screen real estate as a folding phone, albeit with a bezel down the middle, and 2, it's significantly more expensive than a normal phone.
Unlike a folding phone where flexible screens are a relatively immature technology, there is no reason to suspect the Microsoft phone of being particularly fragile.
Obviously there are people who have bought folding phones - there are a lot of people on this planet for whom a few thousand dollars is pocket change.
The nature of consumer technology is that you see lots of flawed devices in a new category for a few years before you see a decent implementation.
Anyway, your comment is the first suggestion I've seen that it is gamers who want an expensive folding phone. Can't think why they would. Seems a better match for folk who need a usable instance of Excel that fits in a trouser pocket.
Samsung also sell 'de-glitzed' versions of the flagships, such as the S10e and the S20 FE. Buying those would give Samsung hard sales figures that might affect their future decision making. Just a thought.
I like the LG phones too. The G2 was a great all-rounder and the first Android device with support for native playback and output of high-res audio. After that, the G range went a bit odd (like their doomed-from-the-start module system), leaving their less publicised V range as their sensible offerings (at a time when it was harder to distinguish Android flagships). Then LG had a bit of bad luck with a bootloop issue.
Freeing up screen space by placing signal and battery status bars in line with the front camera was implemented by LG (with a small secondary screen) before the iPhone did the same with the 'notch'.
The LG Wing didn't grab me when it was launched, but the other day it occurred to me that it could be great for browsing a mix of text and images - if its browser automatically displayed landscape images on the landscape screen whilst the user scrolled through the webpage on the lower portrait screen.
Landscape pictures on a portrait phone screen are underwhelming, reading and scrolling text on a landscape phone is a faff.
The 2011 MacBook Pro graphics issue - linked to the introduction of lead-free solder - can't occur if the GPU is part of the SoC.
As others have noted, other firms' products, such as Microsoft's Xbox 360, also suffered faults due to inexperience with lead-free solder.
Leaded solder can still be used if the components are headed for a military or aerospace application.
I suspect that Rolex and Patek had no choice but to treat the threat of quartz watches back in the 1980s as a learning opportunity, and so did a deep examination of themselves and their market.
(Sidenote: I'm fascinated by mechanical watches, but I don't like their reliance on eventual servicing - somehow it feels wrong, like the reliance of many a quartz watch on a new battery every few years. Both the servicing of mechanical watches and the manufacture of button cells require a level of infrastructure. There is one premium watch brand that released an automatic watch with silicon components, allegedly negating the need for servicing, but at stupid money it isn't designed to shake up the industry.)
Apple make a lot of money from their customers. However, their customers derive a lot of benefit from their products. It's not a zero-sum game.
(Speaking as someone who doesn't own any Apple products, but has experienced countless frustrations with other OSs, computers, media players, phones etc., and has been aware enough of Apple's offerings to know that that *particular* frustration wouldn't occur on the equivalent Apple product. I also know that Apple kit can have its own issues that can frustrate. If I were to join you in over-generalising, I'd suggest that non-Apple products often have issues that stem from clumsiness, messiness and a lack of care in their design, or from poor coordination between hardware and software partners, whereas frustration with Apple kit tends to stem from informed and deliberate decisions by Apple.)
> I don't see how filling a storeroom with Tamagotchi toys makes sense
It makes sense for the vendor because instilling a sense of scarcity in a consumer's mind can bypass their rational decision-making process.
As a result they may pay more for your product.
Another benefit to the vendor of scarcity (artificial, or just a result of a sane decision to only tool up so many production lines) is that scarcity can lead to free publicity. Current example: I'm not in the market for a new PlayStation, but I'm still aware that there is a shortage of Sony's new console. Left unchecked, this observation might cause one to think 'Ooh, this new games console must be pretty good if that many people want one'
The 'rational actor' model of economics was debunked decades ago, though some more weeding might be required!
Crank up the forges? It depends upon their contracts and options for capacity with their foundry partner. Do bear in mind that automobile production has slowed due to a shortage of semiconductor fabrication capacity - it might be that a car maker will pay more for their chips if the alternative is not being able to sell their cars.
Also, I don't know where the bottleneck is in the PlayStation 5 production, but it is early in the life cycle of this generation of AMD-powered games consoles.
> The only inherent value in crypto, as far as I can tell, is that it is anonymous therefore useful to money launderers
The morality of money 'laundering' depends upon the state in which you live - much like the definition of 'criminal'. When Jewish people wanted to leave Germany in the 1930s, they wanted to take their assets with them. Jewellery was more portable than art; art had to be sold at a big loss. Some Iranians emigrating after the fall of the Shah converted their assets to heroin - a portable commodity.
More data and analysis required.
Over the long term, you could even see miners taking care of their cards with a view to maximising the resale value, and cultivating a good reputation for customer service (i.e. testing used cards before shipping, and refunding or replacing cards that slip through. Such a reputation would allow them to charge a little bit more for their used cards, and to sell their stock more quickly than competitors).
There is enough discussion online about the reliability of used mining cards to suggest it's far from straightforward. Some mining cards are overclocked, some are underclocked for efficiency, some just have their VRAM overclocked. Since miners themselves don't benefit from having a broken card, many are careful about good cooling, perhaps more so than a teenager trying to maximise their frames per second in Death Kill Auto 3.
Good points. And yes, I think we can all agree that the high energy requirement of Proof of Work cryptocurrencies lies somewhere between inelegant and evil.
What do you suggest in place of gold, proof-of-work cryptocurrencies, seashells or fiat currency?
Do you have any money invested in a Proof-of-Stake cryptocurrency? Okay, I ask this last question tongue in cheek, as a nod to the issue of trust on the internet. (The logic is that anyone who wants the value of a PoS crypto to rise has a motive to highlight the 'evil' energy requirements of Proof of Work.) :)
> Would you buy a second hand graphics card that had been run at 110% 24x7 for years, the MTBF has been eaten away.
Depends on how much money it's being sold for. And that price is dictated by demand - perceptions of unreliability will lower the price. MTBF is only a single point - it doesn't tell you about the shape of the graph. It might be that a card used continuously will last a long time because it hasn't been subject to thermal cycling. It might be that continuous use shortens the life of its onboard RAM.
More data required.
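To put some numbers on why MTBF alone misleads, here's a small simulation. All the figures (a five-year mean life, a Weibull wear-out shape of 3) are made up purely for illustration; the point is that two card populations with identical MTBF can have very different survival curves:

```python
import random

random.seed(0)
MTBF = 5.0  # years - the same mean life for both populations (illustrative)

# Population A: constant hazard (exponential) - failures happen at random.
# Population B: wear-out (Weibull, shape k=3) - few early deaths, then a cliff.
# Weibull mean = scale * Gamma(1 + 1/k); for k=3, Gamma(4/3) is about 0.8930,
# so we pick the scale that gives population B the same mean as population A.
k = 3.0
scale = MTBF / 0.8930

N = 100_000
a = [random.expovariate(1 / MTBF) for _ in range(N)]
b = [random.weibullvariate(scale, k) for _ in range(N)]

# Fraction of each population still alive at 1, 5 and 8 years
for t in (1, 5, 8):
    surv_a = sum(x > t for x in a) / N
    surv_b = sum(x > t for x in b) / N
    print(f"t={t}y  constant-hazard: {surv_a:.1%}  wear-out: {surv_b:.1%}")
```

The wear-out population looks far healthier at year one and far worse at year eight, despite the identical MTBF. Which curve a mined card follows (thermal cycling spared? RAM worn?) is exactly the data we don't have.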
> Not sure what you mean by "long term". Weeks or months really isn't long term.
Looking at the area under the graph of units sold, weeks often *is* long-term. As he said, making these cards unattractive to miners *at launch* by hobbling the drivers will free cards up for gamers.
Why is Nvidia favouring gamers over miners, when they all pay the same retail price? Likely because Nvidia sees gamers as a more predictable and longer-term market (Bitcoin is already uneconomic on GPUs, and Ethereum is eventually transitioning to Proof of Stake), and seeding goodwill is sensible. Pushing miners towards specialised cards won't hurt Nvidia, either.
The GTA woman looked like a generic blonde starlet to me, but then I'm better at distinguishing some faces than I am others.
Or perhaps it says something about the Hollywood system that lots of women who look like Lindsay Lohan (or Alicia Silverstone) are cast in films and on TV.
Got a new motherboard around 2008, printed in big letters on the PCB was 'JAPANESE CAPACITORS'. Apparently a few years previously, Korean firms indulged in a bit of industrial espionage for a new type of Japanese capacitor, but they hadn't stolen all the documents. They missed test reports indicating medium term failures, and the tweak to the chemistry that the Japanese had developed to fix the issue. Lots of computers from well-known brands were affected.
A hemispherical wire cage bolted over the PC's vent should do the trick.
Since storage boxes are made cuboid so that they can be stacked next to each other without wasting space, is it possible that someone seeing a PC might think it was cuboid for the same reason?
Another possible design failure is giving standard cubicles to people who are evidently expected to deal with lots of paper hardcopy.
I'm not meaning to be overly sympathetic to your users, but expecting people to be not stupid (in the face of all evidence) is itself a form of stupidity (I'm not calling you stupid, I mean computer and office designers or the system in which they work).
The trouble is, an employee isn't going to use a better way unless they know it exists.
Also, if they expect there's little chance of a useful change being made to a UI, they might not bother to make suggestions.
To borrow an analogue from the physical world: it's not always easy to perform time and motion studies on yourself.
> People generally find out by experience what the most economic way of doing their job is.
You'd think, wouldn't you? Yet my colleague switches tasks by moving the mouse to the taskbar instead of using Alt+Tab. She uses the menu for Copy and for Paste, too, eschewing keyboard shortcuts. What makes it worse is that her trackpad isn't the best.
Hmmm...
A couple of months ago the literary estate of Iain M Banks announced their decision not to continue with the Amazon-funded television adaptation of Consider Phlebas, after a couple of years of development by writer Dennis Kelly ('Utopia' - deep, humorous, uniquely styled, possibly too violent for some).
At the time I'd assumed the estate perhaps had reservations over the treatment of Banks's book. Now, though, it seems plausible that they feel Banks's values are at odds with Amazon's practices.
Just a thought.
At the time, there was a spokesman for Amazon's warehouse workers on the news, outraged that his and his colleagues' health was being put at risk so that people could have dildos delivered to their doors.
I can't remember his exact words, but I was left with the feeling that he wasn't against, say, nappies, books, cleaning supplies etc being picked, packed and delivered if it could be done safely.
There was a public health policy person on the Ezra Klein show the other day - someone who had correctly called the major events of the pandemic in advance.
She noted that journalists and policy makers were slow to grasp the epidemic, while Silicon Valley types were fast to grasp the situation because they are used to thinking in terms of exponentials.
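The point about exponentials is easy to state but hard to feel. A trivial sketch makes it concrete (the numbers are illustrative only, not real epidemiology): something doubling every three days grows roughly a thousandfold in a month.

```python
# Illustrative only: 100 cases, doubling every three days.
cases = 100
for day in range(0, 31, 3):
    print(f"day {day:2d}: {cases:>7,} cases")
    cases *= 2
# Day 0 shows 100; day 30 shows 102,400 - over a thousand times more.
```

Linear intuition says "a month of growth like last week's" and lands two orders of magnitude short, which is roughly the gap she was describing between the two groups.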
A lot of concrete is used not because it is necessary, but because it is easy to use.
There are also ways of using it intelligently, e.g. using concrete in pre-cast shapes that can be reused for another building in the future (just as stone blocks or bricks are taken from a disused building to make a new one).
Mr Gates wasn't saying 'don't plant trees', he was saying that planting trees is not a panacea. 'Look at the system intelligently and plant trees if appropriate, but don't rely on just this action' is closer to what he's saying. And he's right.
Peat bogs are a superb carbon sink, but in the past, due to misguided policies, they have been destroyed for plantations of fir trees so someone can claim a carbon credit. This has resulted in more carbon being released than stored by the trees, and destroyed the area's biodiversity to boot.
> The only why to stop people opposing population control is to perforate the brain.
If the aim is to reduce the population over time, then surely you need to look at what works? Top-down control doesn't work well (it failed in India) and has unintended consequences. What does work reliably? Access to healthcare, reduced infant mortality, female education, access to birth control.
The problem is that healthcare and education etc. require a level of development that, if done unintelligently (i.e. how it has largely been done by us 'developed' nations), requires a lot of resources. However, the good news is that reducing the amount of resources required for such levels of healthcare and education is in part an engineering problem. Engineering problems can be engaged with.
> Every child we "save" from some disease, for smiply no other reason than "Because we can.", is in the end just an extra mouth to feed.
Actually, there is a correlation between good health care and lower birthrates. After all, if you are confident your child will survive to adulthood, you will be happy with a smaller family.
Obviously there are other factors, such as societal structures (do you need your children to care for you in your old age?) and women's ability to access and use contraception - but again, greater development tends to lead to a lower birthrate.
Actually, where I've really noticed colour accuracy recently is when attempting to buy clothes from the internet (physical shops not being open for much of the past year).
I kept wishing that sellers on eBay would photograph their wares against a standard swatch card (since that seems a more realistic approach than hoping the seller calibrates their lighting, camera, workflow etc). As it is, it's hopeless trying to judge the colour of something on eBay et al.
@Snake
My firm's marketing and Point of Sale material doesn't require *excellent* colour accuracy, just *good* colour accuracy. We don't tend to use, for example, pictures of people where perhaps strange skin tones would be noticed.
I know enough to know what I don't know, and that I would have some learning to do before working in a different sector such as art, fashion or food photography.
It's like how a carpenter might work in mm, but respects the metal worker working in micrometres.
Regards
[Context: my Dell laptop is about ten years old but still largely fit for purpose. However, I've never calibrated the colour of its LCD display. When preparing point of sale artwork to be sent off to be printed, I first check it on my Samsung Galaxy, the office printer, and my colleague's MacBook. The OLED Galaxy phone, MacBook and printer agree with each other and, I've found, with the finished promotional material the printers produce. Close enough for our purposes.]
> demand is forecast to outstrip supply with a relative scarcity of panels due to remain a problem until midsummer.
[Assuming the article means display panels] Samsung Display is finally getting around to making OLED laptop panels for OEMs. I don't know if this is related to any shortage of conventional laptop panels, or if Samsung has seen that there is a market for pricier laptops. Or maybe Samsung Display have improved their yields of bigger OLED panels.
There's also a correlation with rumours of Apple bringing Mini LED displays to MacBooks soon (effectively an evolution of the 'local dimming' approach: thousands of LEDs behind an LCD filter, to the extent that it approaches the contrast of OLED but with greater brightness). Further in the future, 'Micro LED' - commonly taken to mean one tiny LED per RGB subpixel, with no LCD layer at all - could be a contender.
From the perspective of the Olympic sprinter, those fractions of a second matter. But most applications aren't a competition - if these sprinters were put to work delivering letters, the difference in speed between a gold medal champion and a sprinter who (merely!) qualified for the race would most likely go unnoticed.