Re: Coffee chain
Yeah, and they can use those GPUs to roast the beans!
Signal is the least bad of the bunch, but still suffers from the lack of interoperability (aka "federation").
(Also, using your own servers - required if you want the best security - might be poorly supported?)
The XMPP (aka Jabber) protocol is the way to go (using OMEMO for end-to-end encryption).
Notable programs supporting it are Pidgin (desktop),
Jitsi (with the additional feature of cross Windows-*nix video calls),
Conversations (Android, free on F-Droid),
and ChatSecure (iOS).
That's not my experience - I was impressed with the performance of the Asus Transformer I had to use for a week.
(The USB mini-B ports and the Bluetooth keyboard were clunky, though...)
Your issue seems to be that you expected to be able to run the full bloated MS Office on it (and got baited into it).
For heavy stuff like that you need a Surface Pro - which costs FIVE TIMES MORE!!
(The right tool for the job...)
Yeah, sorry, but this "responsive design" fad is nonsense.
It's just not possible to make a single user interface that works as well on a tiny, fat-fingered smartphone touchscreen as on a widescreen mouse-and-keyboard PC.
There are multiple ways around it, like using separate, dedicated programs for browsing news (RSS feeds) and writing comments on mobile.
Or have a very simple, standardized layout that lets the browser do the job... (also, hiding the pictures is certainly a job for the browser side!)
"However, if you have a working surface and you're tired of windows, the one I was working on managed to run Linux quite well, with no driver issues."
Which one? I was told that Ubuntu drivers were a disaster for anything "touch-related" on the Surface Pro 3... which is annoying because I have an even harder time justifying running Windows to myself.
"I bet a good percentage is spent on, say, displaying white pixels on computers for Google's home page.
On World Eco day they once went black and estimated the savings worldwide and it was a significant chunk."
This doesn't make sense: most displays are LCDs these days, and those don't consume less when displaying black (because the fluorescent/LED backlight is always on)...
"Maybe some of those Youtubers should try doing what BigClive does: don't rely on Youtube ads to cover costs, explain why it works better for him and his viewers if the money needed to buy coffee and cookies doesn't pass through Youtube, and hope it works out OK. Everybody here is aware of BigClive, right?
Or maybe more prospective YouTube starlets could repeat what some lowlifes have achieved with the help of the Daily Mail this week: use the Daily Mail's 'news' coverage to provide free advertising of dubious videos set in dodgy London shopping malls (and worse), thereby enabling this stuff to reach an audience that wouldn't normally even know such videos existed ("Barbra Streisand effect" gone mad)."
Exactly. I've never heard of BigClive. I *have* heard of a YouTuber disrespecting a corpse.
"Whuffie [a fictional reputation-based currency] has all the problems of money, and then a bunch more that are unique to it. In Down and Out in the Magic Kingdom, we see how Whuffie – despite its claims to being ‘‘meritocratic’’ – ends up pooling up around sociopathic jerks who know how to flatter, cajole, or terrorize their way to the top. Once you have a lot of Whuffie – once a lot of people hold you to be reputable – other people bend over backwards to give you opportunities to do things that make you even more reputable, putting you in a position where you can speechify, lead, drive the golden spike, and generally take credit for everything that goes well, while blaming all the screw-ups on lesser mortals."
I feel that there are many parallels to hoarding YouTube subscribers, "likes", and view-based income through advertising...
This whole discussion is so funny to look at now that I've read this:
"The Borderers are [...] a bunch of people who lived on (both sides of) the Scottish-English border in the late 1600s.
the Scottish-English border was terrible
Life consisted of farming the lands of whichever brutal warlord had the top hand today, followed by being called to fight for him on short notice, followed by a grisly death. The border people dealt with it as best they could, and developed a culture marked by extreme levels of clannishness, xenophobia, drunkenness, stubbornness, and violence.
The English kings finally [...] decided to make the region economically productive, which meant “squeeze every cent out of the poor Borderers, in the hopes of either getting lots of money from them or else forcing them to go elsewhere and become somebody else’s problem”.
Many of the Borderers fled to Ulster in Ireland
the Ulsterites started worrying that the Borderer cure was worse than the Irish Catholic disease. So the Borderers started getting kicked out of Ulster too, one thing led to another, and eventually 250,000 of these people ended up in America.
So these people showed up on the door of the American colonies, and the American colonies collectively took one look at them and said “nope”.
Except, of course, the Quakers. The Quakers talked among themselves and decided that these people were also Children Of God, and so they should demonstrate Brotherly Love by taking them in. They tried that for a couple of years, and then they questioned their life choices and also said “nope”, and they told the Borderers that Philadelphia and the Delaware Valley were actually kind of full right now but there was lots of unoccupied land in Western Pennsylvania, and the Appalachian Mountains were very pretty at this time of year, so why didn’t they head out that way as fast as it was physically possible to go?
At the time, the Appalachians were kind of the booby prize of American colonization: hard to farm, hard to travel through, and exposed to hostile Indians. The Borderers fell in love with them. They came from a pretty marginal and unproductive territory themselves, and the Appalachians were far away from everybody and full of fun Indians to fight. Soon the Appalachian strategy became the accepted response to Borderer immigration and was taken up from Pennsylvania in the north to the Carolinas in the South
The Borderers really liked America – unsurprising given where they came from – and started identifying as American earlier and more fiercely than any of the other settlers who had come before.
They also played a disproportionate role in westward expansion. After the Revolution, America made an almost literal 180 degree turn and the “backcountry” became the “frontier”. It was the Borderers who were happiest going off into the wilderness and fighting Indians
This was a big part of the reason the Wild West was so wild compared to, say, Minnesota (also a frontier inhabited by lots of Indians, but settled by Northerners and Germans) and why it inherited seemingly Gaelic traditions like cattle rustling.
Their conception of liberty has also survived and shaped modern American politics: it seems essentially to be the modern libertarian/Republican version of freedom from government interference, especially if phrased as “get the hell off my land”, and especially especially if phrased that way through clenched teeth while pointing a shotgun at the offending party.
But Albion’s Seed points out that the Borderers were uniquely likely to identify as just “American” and deliberately forgot their past ancestry as fast as they could. Meanwhile, when the census asks an ethnicity question about where your ancestors came from, every year some people will stubbornly ignore the point of the question and put down “America” (no, this does not track the distribution of Native American population)
And from another website:
Behind this dynamic is the most enduring of the cultural divides in American society, the divide between the urban enclaves of the nation’s periphery—prosperous, irreligious, and culturally dependent on European models—and the relatively impoverished hinterlands, with their loyalty to Protestant religiosity and American folk culture. That divide came into being long before the Revolutionary War and it’s been a massive influence on our culture and politics ever since.
I think the same author mentioned at some point (but I cannot find a reference to it now), that in the USA,
in opposition to Borderers-Cavaliers/"Republicans"/"Conservatives"/"Red States",
the Puritans-Quakers/"Democrats"/"Liberals"/"Blue States" were often uncomfortable with calling themselves "Americans"!
(So, this discussion, while superficially about definitions, is more likely to be actually about group signalling!)
Not "period, full stop". Smartphones were niche too. Then Apple got in the game and dazzled everyone with their marketing (remember how the first iPhone, unlike its competitors, didn't even have third party apps, copy/paste and 3G at launch?).
With better technology, usability and marketing AR/MR/VR is very likely to become much more than niche.
Are Windows Mixed Reality requirements that much higher than Oculus Rift ones?
Because I have a cheap Bulldozer CPU with a Radeon 470 (that I thankfully bought just before the mining craze), and 95% of games run well!
(And the remaining 5% run so poorly, with similar "required minimums", that I suspect their developers just didn't bother to test their games with Bulldozer CPUs!)
The SteamVR benchmark even says that my PC is "VR Ready!": https://steamcommunity.com/sharedfiles/filedetails/?id=1116639906
I still have one in my drawer and it does have a "cute little door". Though I personally would probably use the term "nifty" rather than "cute":
That phone is so incredibly stylish for when it was released that I could probably still sell it...
"We’re sorry to leave the direct-to-consumer keyboard business, but this change is necessary to allow us to concentrate on developing our AI solutions for sale directly to businesses," wrote Nuance.
Translation: "We've slurped all the personal data we need so that we can now sell it for a hefty sum. So long, suckers!" /jk
More seriously, isn't this a bit like Google no longer reading our e-mails in Gmail, because they have more than enough information to train their AIs, and that's where they expect all the money will be in the future?
That's still a middle man.
However, it's a much better middle man than advertising agencies like Google / Facebook.
And it already exists, from an (un?)likely union of Pirate Bay founders and an ad-blocker company.
I present to you:
Flattr 2.0!
(The only thing they're lacking, besides publicity, is a way to make micropayments to small creators cheap (as Patreon recently found out, that's a problem). That can probably be solved either by waiting for a minimum amount to accumulate before making an actual transaction, or possibly by using cryptocurrencies... I think the Brave browser is trying to do something like that?)
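The "wait for a minimum amount to accumulate" idea can be sketched in a few lines of Python. The fee and threshold figures below are made up for illustration; they are not Patreon's, Flattr's, or any payment processor's actual numbers.

```python
# Sketch of batching micropayments until a payout threshold is reached,
# so a fixed per-transaction fee doesn't eat tiny donations.
# FIXED_FEE and THRESHOLD are illustrative assumptions.

FIXED_FEE = 0.35   # hypothetical per-transaction fee, in currency units
THRESHOLD = 5.00   # don't pay out until at least this much has accumulated

class CreatorAccount:
    def __init__(self, name):
        self.name = name
        self.pending = 0.0   # donations accumulated but not yet paid out
        self.paid_out = 0.0  # total actually transferred (after fees)

    def add_pledge(self, amount):
        """Record a micro-donation; only trigger a real transaction
        once the pending balance reaches the threshold."""
        self.pending += amount
        if self.pending >= THRESHOLD:
            self.paid_out += self.pending - FIXED_FEE  # one fee per batch
            self.pending = 0.0

acct = CreatorAccount("small_creator")
for _ in range(20):          # twenty 0.50 pledges
    acct.add_pledge(0.50)

print(round(acct.paid_out, 2))  # 9.3 (two 5.00 batches, each minus 0.35)
```

With a fixed fee per transaction, batching twenty 0.50 pledges into two 5.00 payouts cuts the fee overhead from 7.00 (twenty fees) to 0.70 (two fees) - which is the whole point of accumulating before transacting.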
"This is the problem, Mr Milner," Ian Lucas said to Facebook's UK policy manager Simon Milner. "You have everything. You have all that information; we have none of it because we can't see it."
I'm seeing an "easy" and tempting solution here:
"Give us [the government] this information, GAFAM, and we'll leave you alone!"
Which would be a disaster:
it's one thing to have your personal data siphoned off by a transnational corporation that DOESN'T CARE about you;
it's very much another thing for that data to be available to a government that might or might not still be a democratic one in a few decades, that "CARES" about you, and can use force against you.
First, there's Flixxo:
a peer-to-peer, decentralized video-sharing network that wants to pay the people who:
- create the videos
- host the videos
- share the links to these videos
- watch advertising
using a cryptotoken that the viewers will spend.
*(disclaimer: I helped to fund Flixxo through their ICO)*
However, if you're allergic to cryptocurrencies and would rather get rid of advertising on the Internet, there's Flattr 2.0:
a browser extension that records which websites you visit and how much you "engage" with them,
and then at the end of each month distributes (most of) the money you send them to the creators who have registered their websites,
in proportion to how much you interacted with those websites during that month.
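That monthly split can be sketched as a simple proportional division. The 10% platform cut and the engagement counts below are made-up numbers for illustration; Flattr's actual engagement metric and fee structure may differ.

```python
def distribute(budget, engagement, platform_cut=0.10):
    """Split one month's budget among registered sites in proportion to
    recorded engagement, after a platform cut ("most of" the money goes
    to the creators). The 10% cut is an illustrative assumption."""
    pool = budget * (1 - platform_cut)
    total = sum(engagement.values())
    if total == 0:
        return {}
    return {site: pool * count / total for site, count in engagement.items()}

# A 10.00/month budget, engagement counted as interactions per site:
payouts = distribute(10.00, {"blog_a": 30, "videos_b": 50, "comic_c": 20})
print(payouts)  # blog_a ~2.70, videos_b ~4.50, comic_c ~1.80
```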
(Though I'm afraid that the underlying issue -
YouTube and Patreon being less and less willing to "monetize" small creators because payment fees are too high -
will affect Flattr in the same manner; cryptocurrencies might be the only solution for the final micropayment to the creator...)
"For both eyes combined (binocular) visual field is 135° vertical and 200° horizontal."
the high-resolution area of the eye doesn't seem to have a clear cut-off... (until the blind spot at ~15°?):
I'm pretty sure that you're wrong:
you might be confusing it with the "Asynchronous SpaceWarp" visual trickery that the Rift does, which allows it to cut the framerate pushed by the graphics card in half, to 45 Hz, with hardly any noticeable difference?
1.) Remember that putting our visual bandwidth in terms of bits/second involves making assumptions not only about resolution, but also about colors, brightness, and framerate!
(I'm assuming that the quoted "74 gigabytes of visual data are available to us each second" assumes the sRGB color space and 90 Hz? Source?)
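To show how much those assumptions matter, here's a toy "uncompressed video" model: pixels × bits per pixel × framerate. The 576-megapixel value is the often-quoted (and much-disputed) estimate of the eye's resolution; all three parameters are illustrative assumptions, not the quoted source's actual model.

```python
def visual_bandwidth_gb_per_s(megapixels, bits_per_pixel, framerate_hz):
    """Naive 'raw video' model of visual bandwidth.
    Real estimates of the eye's data rate use very different models."""
    bits_per_second = megapixels * 1e6 * bits_per_pixel * framerate_hz
    return bits_per_second / 8 / 1e9  # bits -> gigabytes per second

# The same "resolution" gives wildly different figures depending on the
# assumed color depth and frame rate:
print(visual_bandwidth_gb_per_s(576, 24, 30))  # ~51.8 GB/s
print(visual_bandwidth_gb_per_s(576, 24, 60))  # ~103.7 GB/s
print(visual_bandwidth_gb_per_s(576, 30, 90))  # ~194.4 GB/s
```

So without knowing the color depth and frame rate assumed, a single "GB/s" figure for human vision is close to meaningless.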
2.) Japan's Broadcasting Corporation (NHK)'s research "claimed the tests showed 310 pixels/degree are needed for an image to reach the limit for human resolution" (which is a lot more than the quoted 120 pixels/degree).
(Though you can indeed see from the graph that you do start getting into diminishing returns above 80 pixels/degree - which also corresponds to average 20/15 vision - but then it *also* looks like the Nyquist sampling requirement doubles that number, and you end up with a minimum of double that: 160 pixels/degree.)
That Samsung 850 ppi display, put in an Oculus CV1, would result in ~23 pixels per degree:
The US Air Force's hypothetical 10,300 ppi display would therefore result in 282 pixels per degree? (Notice that it's close to NHK's research result!)
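That ppi-to-pixels-per-degree scaling can be sanity-checked with a quick back-of-the-envelope calculation. The linear-scaling assumption (same optics and field of view as the CV1) and the 850 ppi → ~23 ppd calibration point come from this thread; both are rough figures, not headset specs.

```python
def pixels_per_degree(ppi, ref_ppi=850.0, ref_ppd=23.0):
    """For fixed headset optics (same FOV, same magnification), angular
    pixel density scales linearly with the panel's pixel density.
    Calibrated on the rough 850 ppi -> ~23 ppd figure quoted above."""
    return ppi * ref_ppd / ref_ppi

# The hypothetical 10,300 ppi US Air Force display:
print(round(pixels_per_degree(10300)))  # ~279, same ballpark as NHK's 310

# Nyquist side note: resolving N line pairs (cycles) per degree needs at
# least 2*N pixels per degree - hence the doubling from 80 to 160 above.
```

With these rough inputs the result lands at ~279 ppd rather than the quoted 282 - the small gap just reflects rounding in the "~23 ppd" calibration point.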
That's weird because I'm short-sighted (slightly, -0.5 diopters), and my (Oculus Rift, CV1) experience is noticeably degraded without glasses -
(which are also annoying, start to bother me over 30 minutes in VR, and risk damaging the (non-removable, extremely easy to scratch) lenses when removing the headset - I'll get lens inserts when I get around to it)
I've assumed that's because the lenses are actually there to angle the light rays so that the image appears to come from far away...
(but I also have light astigmatism, maybe that's a factor too?)
You don't need an overly expensive PC for VR:
I'm getting good enough performance for ~90% of use cases with an
AMD (Bulldozer) FX-8320E (£100) (the 8 cores seem to help a lot)
and a Radeon RX 470 (£175)
(and I blame most of the issues with the remaining 10% on the developers not bothering to test their software with Bulldozer-line CPUs)
Biting the hand that feeds IT © 1998–2019