126 posts • joined 22 Sep 2007
Re: I smiled
Except that is true. Amazon only sells products if consumers are getting a deal that makes the hassle of shipping and delayed gratification worthwhile. If I can head down to my local bricks and mortar and get the same product for nearly the same price, I'm satisfied now. Amazon can either use volume to make up for low margin or its ubiquity and expansive inventory as leverage on distributors.
At the end of the day, it is for the consumers... because without them, Amazon wouldn't have a business.
It's not that hard to restore...
If you right click in the space next to the tab, you can check the "Menu Bar" option and restore that Win 3.1-era UI element.
There is a point...
When your claims are rejected enough times that the only recourse is the Supreme Court. Typically the other players in your market will try to keep you from going that far, as it might endanger lower-court rulings in other circuits that are favorable.
In an oddity of the US judicial system, if a district or circuit court (each circuit covers a number of states; there are 13 in total) rules one way in a case, but a different court in another part of the country rules another way (or takes a different tack to arrive at the same conclusion), you end up with different precedents. As the Supreme Court cannot overrule or invalidate those rulings without a party petitioning the court, this creates areas where certain legal action is favored. And because each court is largely independent and sets its own procedures, some plaintiffs prefer certain venues.
If, however, you continue to sue and appeal, the case will eventually reach the Supreme Court. A lack of coherence among the various circuits tends to prod that old bear into taking a case, and that is the point of no return: you can't settle out of court once the Supreme Court takes on a case. So if, in this case, Marvell tries to rely on a trick or procedure that is typically beneficial to IP owners in the hardware space, and it's found lacking in front of the Supreme Court, suddenly you aren't invited to the Silicon Valley Christmas parties, and you get a lot of mail returned as undeliverable or marked "Return to Sender." Just see the software patent case that the Supreme Court entertained the other day: the list of software and IP holders filing on both sides is rather extensive. Someone is going to lose, and it won't just be a "Reserved for low-emission vehicles" space at the local Y....
I believe you're thinking of EndNote, unless there is a function of OneNote that I'm not aware of. It was the same mistake I made after I picked up Office Pro through HUP. Then my wife started a Master's program last summer, mentioned EndNote, and the repressed memories of my college days bubbled up through the alcohol haze.
Too late for me...
Through the home-use program, I picked up Office Pro for $10. While most of my OneNotery has happened at work where it's nice to link meetings in Outlook to the agenda and whatnot, I have found home use for it. Making a checklist is easy and it's pretty decent at helping me collate stuff around the house. Not quite as useful, but still worth the $10 for home use.
And yes, I will invalidate this post by saying I like the Ribbon.
Re: 6 m
Does that account for the crustal rebound once all the weight of that ice disappears? Not trying to be snarky; I just know that the Great Lakes region, carved out by glaciers, is still rebounding thousands of years later.
So what about the SlingBox or other remote-viewing item that allows you to take what you've purchased and rebroadcast it to another device while outside (or even inside, I suppose) your residence? And what if you don't have a cable connection and rely on an antenna to pick up local broadcasting (and therefore free after equipment costs)? If I SlingBox my local network affiliate across the nation while I'm traveling, does that mean I'm rebroadcasting? We've already established here in the US it is perfectly legal to record broadcast television and watch it later or over and over for private use.
The problem you and others are demonstrating is a belief that there is some kind of iron-clad law or settled case law that resolves all these vague and possibly conflicting definitions, when the reality is that there is none. The Act cited in the case was written in the 1970s, when cable television companies were a new thing. I'm sure if you went back and looked, you would see the fingerprints of NBC, ABC, and CBS all over it as they worked to protect their place in the broadcast world.
When ABC tried to take Aereo to court using the same tactic, the Second Circuit found that the copy made by the user on Aereo's hardware was not a public performance, as the copy was limited to the user and the user's account; therefore it does not violate the Transmit Clause of the 1976 Copyright Act. This was after the court agreed that the 1976 legislation was specifically created to prevent cable companies from capturing a single over-the-air feed and rebroadcasting it to subscribers' homes. Delving perhaps a bit too far into the technical minutiae, the court also noted that if an Aereo subscriber picked a show and clicked Watch, there was a 5-10 second delay between the actual OTA feed and what the user was watching, which meant it was a copy (as it was technically being saved to a storage device) for private use, a key distinction when looking at the fair use ruling.
So it's not a failing of reading comprehension; it's the reality that there are numerous interpretations of the laws, the case law, and the technical specifics of how the service works. As I and probably most other people here (including the author) are not lawyers, our piecemeal interpretations of barely read acts and rulings are about as useful as pissing into the wind.
I think we need to clarify a few things...
First, Aereo only offers to provide this service to you if you live in the same general broadcast area the signal is originating from. So if I live in Dallas (god forbid), I can't stream the local broadcasts from Chicago. Now, if I lived in Chicago and was traveling to Dallas, I could keep up with my local news (or the real issue, the various Chicago-area broadcasts of an NFL or MLB sporting event) by streaming them online to my laptop or tablet.
This is really a way for folks to get a clear signal of their locally-broadcast networks, which again, are broadcast for free (ad-supported, really) to anyone in the area with an antenna that can pick up the signal. And if you live in a very built up area or in an area that is a dead-zone, it can be hard to get that signal. And in today's world of digital signal, you can't watch a noisy signal and still see things with a snowy picture; it's all or nothing.
As far as ads, there is no difference between watching your local NBC affiliate over the internet, air, cable, or time-shifted. Today's ratings are based on numerous surveys that capture both in the moment viewing as well as same-day viewing (accounting for time-shifting).
I imagine that Judge Kimball's ruling will just become one more data point for the Supreme Court when it rules on the matter (it has already taken up the case). The federal appeals court covering New York, where Aereo is based, ruled in favor of Aereo, which led to the appeal to the Supreme Court. I believe other courts ruled in favor of Aereo as well, and this is the first loss they've suffered.
Re: Blame @ Charles Manning
The analogy is misguided. It relies on a genetic predisposition that could only be realized with copious amounts of training and resource investment, and then it is used as some kind of benchmark against which we should all somehow measure ourselves.
Also, your tilt at the mythical equality champion falls a bit flat once people realize that only a small minority subscribe to a Marxist utopia or similar paradigm. The reality is that equality, even in the loosest of definitions, doesn't exist today, even in the richest nations. If it did exist, outside of the equal opportunity we share to have our mortal remains returned to the earth, many of the issues that plague us today might be less prominent.
The problem that a Tom Perkins or other 1% self-made martyr fails to understand is that we don't begin in the same starting blocks. In the 100m sprint of life, to rework your sprinter analogy, most of the 1% began somewhere around or after the 50m mark. Almost all of the richest came from at least a median household, and few who begin in a median household end up back at the starting line. It's an imperfect analogy, as it's a race few ever "complete" before they die, and people run backwards for a variety of reasons. But he's complaining because others point out that he's further along, and they think he got an unfair start, that they were unduly hampered, and so on.
That's not to say he didn't earn his wealth, but who's to say that if he had been placed in different circumstances as a child, he would ever have attained what he did? The statistics say he likely wouldn't have advanced as far, and he might have just been a really dedicated coal miner or assembly-line worker. The luck of the draw has more to do with the circumstances of your birth than with the DNA you acquired at conception.
And finally, we really need to set aside the whole notion that your worth is measured only by what someone else is willing to pay you for a specific set of skills that may have nothing to do with survival or flourishing. We all have basic needs to meet, and if those are met, I don't see how it is anyone's business what I do after that.
I hope to see a bit more bipartisan work towards revamping the idea of welfare (individual and corporate) into a basic or guaranteed income, much like what some Swiss are trying to push. Everyone of a certain age and legal status (commonly legal permanent residents and citizens) is given a stipend each year that covers the costs of basic living, paid as a cash benefit. Whatever that person wants to do with it, they can. This idea has proponents on both the right and left, because it replaces other, inefficient forms of welfare, reduces bureaucratic overhead, is easy to implement, and, since everyone gets the same check, no one can whine like a jilted 5-year-old on the playground about not getting their fair share. Tax income earned beyond that stipend, and keep the rate roughly equal up the chain. You could also do away with minimum wages and other wage supports, and people could move wherever they want for a job without fear of losing benefits (a huge problem in the US today).
Re: Fueled by sugar, but fuel is not flammable...
Sugars are quite flammable. In 2008, a sugar plant in Georgia (state) suffered a catastrophic dust explosion that was caused by ignition of the sugar dust in the air. 14 people were killed and 40 injured; the fire burned at around 4,000F (compared to the usual 1,000F to 1,800F a typical building fire sits at).
Maltodextrin is different from sucrose (refined sugar), but both carry the same dust-explosion risk. In a battery, where it is unlikely to be in a dust-like state, it won't be explosive, and flammability might be limited, especially in an aqueous solution. Nevertheless, sugars can burn, and burn hot.
Re: An important point...
And, as is the wont of living in America, that is a product that isn't available in my Top-15 metro area. Go figure.
Re: An important point...
I would also like to add that while I use the phrase cable TV, I mean broadband. And to be clear, this is last-mile stuff. The truth is that cable TV franchise agreements were how the copper, then fiber, networks were built out that allowed cable TV providers to get into the broadband business. DSL, because of common carrier rules and the use of those telephony assets, did provide competition to cable, with multiple providers in the same region. But I believe there were some rulings in the late 90s or early 00s that allowed the owners of those phone lines to charge whatever they wanted to those DSL ISPs for maintenance and build-out, and the telephone companies certainly did that. So consolidation happened there as well. Hence the two options: Comcast (cable) and CenturyLink (DSL).
An important point...
It was mentioned that Americans don't have much choice in the ISP department. In a very large metro area, I have two choices for broadband: Comcast and CenturyLink. I've had both, and don't much care for the prices or customer service of either. In the last five years, both have moved to 2-year contracts with large ETFs; even with the contracts, they have the ability to raise prices above and beyond the usual reset to the regular price after a 6-month promotional period ends. If one began degrading my access to their competitors (in Comcast's case, damn near anyone not part of NBC), I would have to pay a large ETF, move to CenturyLink, and hope they aren't retaliating against Comcast. And if both decide that Google or Netflix has gotten too cozy in the content provider realm, I'm SOL. So it's not, as seemingly asserted throughout the article, that Americans are too dumb to notice; we just don't have the ability to do anything about it.
And in case someone asks, the reason we have so few options is that a while back, Congress bought into the idea that data delivery systems are expensive and require guarantees of usage to make it worth a service provider's money to build them out. So every local city was able (required, really) to promise single-service-provider access to its residents. These "franchise agreements" were fine when there were as many cable TV providers as metro areas. But consolidation in the late 80s and 90s meant that there are now only a couple of national players; the handful of local or regional providers that remain are in rural areas, as the Comcasts of the world deem those places too expensive to bother with. So we are left with wonderful results like a carriage-fee dispute with ABC or NBC (national networks with many cable-type channels) leaving whole regions (like the small area of NY, Boston, and DC) without one of the networks that make up 25% of the TV watched.
Even when the end-users might have some control...
IT comes in and complicates things because that's how it's always been done. For example, a current project I'm working on was RFP'd to external vendors because the IT group said, "Nope, we don't do that kind of custom work anymore." So, with their blessing, a vendor was chosen whose solution had a key feature: letting the business users create their own basic functionality (it's really simple stuff) within a framework created through Vendor/IT/Business collaboration and enhanced in the future through a typical development process. Just this week, one of the software architects gave a long presentation on how all changes, even to that business-controlled functionality, should be married to the 9-month development cycle for full IT development work: QA, IT version control, tollgates, etc. Never mind that the solution is replacing a business-driven, working system running on an EOL'd platform, or that the business is on a 30-90 day turnaround for new requests. At least one sane IT voice said, "Well, having a monthly release separate from the IT calendar would be wise, just so the Help Desk knows what's coming if there is a problem."
I've been in both business and IT. I've seen complex business processes managed out of Excel spreadsheets whose original author left years ago, leaving the business with a "suck it and see" change management process. I've also been part of IT groups who think even document templates are an IT-managed resource, and woe to the business team who thinks they can run an end-around by creating their own (this really happened... I was floored that IT cared that much about a Word doc that wasn't part of any IT process). IT is necessary, as many of the skill sets needed in business are not useful when it comes to development. But the fact remains that IT is not always agile enough to adapt to market changes, sometimes business process breaks a system regardless of intentions, and just because something has an IC embedded in it somewhere does not make it an IT-owned asset.
Yes, I am saying exactly that. There are a few posts on his site about his conversations with smaller issuers and what they do when there is a breach like this. And they do it not to get the data off the market but to get a list of the impacted cards.
But while a copy might be kept, the cards have little value if they don't work. I'm sure these thieves know that some banks buy their customers' data back, so it ends up being a nice little extortion racket. Even so, the underground market had the books sorted by zip code, since nothing flags a transaction like its being 2,000 miles away just a few hours after the legitimate cardholder bought gas near home. Being used in multiple locations, even close to home, at or near the same time is another simple flag, so they have an incentive to sell a book only once. It's no different from merchants of legitimate goods: if you sell crap wares, you don't have a lot of repeat business, and eventually you have a lot of product going bad fast (and even faster once a breach is reported in the press).
Cloned credit cards are only useful in physical stores, as the CVV2 (the three-digit code on the back) is not required for swiped transactions. The other CVV is part of the same magnetic track that was stolen, but it is useless if you try to buy from Amazon. As credit card companies are required to provide fraud protection, the damage to a customer is minimal: just check online for odd transactions, call the bank, file a report, and wait. Sometimes it requires a bit more legwork, but for the most part it's no more than an annoyance. Most banks today recognize that if they shoulder the cost of fraud, they need robust systems on their side to detect it. In fact, in some cases detection has gotten too good and results in declined swipes because you are traveling or buying something well outside your normal transaction history.
Debit cards are different and more secure. Whoever Target entered into a debit card processing agreement with, they agreed on an encryption standard for the PIN, as that is a "stronger" form of identity validation and is probably protected by law. Target is probably on the hook with the various issuers if it fails to encrypt that information, and Target does encrypt it with one of the strongest options available in a commercial setting.
In the end, it sucks to change your PINs (just in case) and pay a bit closer attention to your cards. But Krebs on Security already had a story of smaller banks going out to the card-detail shops online and buying back their customers' information. At a cost of $25 a card, it's not cheap, but it's probably cheaper than settling fraudulent charges with merchants and consumers, and it gets you an exact idea of how many accounts were compromised and require reissuing.
Re: Now over 110M
Actually, it's called sensational reporting. Some bright bulb in the copy editing room remembered that when you add two numbers together, you get a larger number. The truth is somewhere between 70 and 110 million. The likelihood that there is no overlap between the credit card transaction theft and the customer database theft approaches zero. And given the brand loyalty exhibited by Target shoppers (at least until recently), many of those 40 million who suffered credit card detail theft are also in the Target customer database that was compromised.
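The bounds fall straight out of basic set arithmetic. A quick sketch, using only the two figures reported in the story (40 million card records, 70 million customer records) with the overlap unknown:

```python
# Back-of-the-envelope bounds on distinct Target victims, given the two
# reported figures and an unknown overlap between the groups.

card_theft = 40_000_000      # payment card details stolen
database_theft = 70_000_000  # customer database records stolen

# If every card victim is also in the customer database (total overlap):
lower_bound = max(card_theft, database_theft)

# If the two groups share no one at all (zero overlap):
upper_bound = card_theft + database_theft

print(lower_bound, upper_bound)  # 70000000 110000000
```

The "110 million" headline simply assumes the zero-overlap case, which, given Target shoppers' habits, is the least plausible end of the range.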
I don't think this is a repository of personal information...
Based on the design schematics that were published when this originally blew up, there is little personal information stored within the website architecture itself. Rather, it relies on taking the information entered by a user and makes numerous calls to other, non-public sources that are outside of the website itself. Presumably, that would require knowing a person's information if you wanted to plumb the depths of what the government has on you.
Of course, if the profiles that users have to set up are in fact stored within the public-facing system and can be accessed through the tried-and-true methods of SQL injection and the like, that's a problem. However, evidence to this point suggests such access does not exist (as pointed out by another commenter, the existence of SQL in the autocomplete only shows it's a frequently searched term by users, not a welcome mat with a key underneath). Executing a call to a separate system typically isn't that easy and would require a lot more knowledge of the system's design than script-kiddies with too much time on their hands possess. A DDoS attack is still the most likely (and most damaging, from a PR standpoint) attack vector.
Part of the problem...
Is less the pirates and more a boneheaded ordinance. As written, a candidate cannot be eliminated from consideration until all other candidates before them are verified to have more votes, and, by ordinance, all candidates for whom election is mathematically impossible must be eliminated at once. That's not too bad on its own, as it's pretty easy to count all the first-, second-, and third-choice votes for a candidate and say, "Yup, they can't exceed even these candidates' first-place votes." And in this specific Minneapolis election, all other candidates' first, second, and third choices combined could not exceed the eventual winner's first-place votes. But the problem comes later in the ordinance:
Mathematically impossible to be elected means either:
(1) The candidate could never win because his or her current vote total plus all votes that could possibly be transferred to him or her in future rounds (from candidates with fewer votes, tied candidates, surplus votes, and from undeclared write-in candidates) would not be enough to equal or surpass the candidate with the next higher current vote total; or
(2) The candidate has a lower current vote total than a candidate who is described by (1).
So, instead of just saying, "Yup, Betty's first-place votes exceed the combined first-, second-, and third-place votes of all other candidates," a condition that was known very early on, following the ordinance meant they had to verify, candidate by candidate via clause (2), that each did not have more votes than those in front of them, because of the higher-level requirement that all such candidates be eliminated at once.
Chalk one up for over-analyzing all possible outcomes and trying to define all possible terms, even the ones that seem rather unambiguous.
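The ordinance's two clauses can be sketched in a few lines of code. This is a minimal, hypothetical illustration: the candidate names and vote totals are invented, and it simplifies a real ranked-choice count down to the single "mathematically impossible" test quoted above.

```python
# Hedged sketch of the ordinance's "mathematically impossible" test.
# Candidates and totals are invented for illustration only.

def mathematically_impossible(totals, transferable):
    """Return the set of candidates the ordinance forces out at once.

    totals: current vote total per candidate.
    transferable: votes that could still transfer to each candidate in
    future rounds (lower-ranked choices, ties, surpluses, write-ins).
    """
    ranked = sorted(totals, key=totals.get, reverse=True)
    impossible = set()
    for i, cand in enumerate(ranked[1:], start=1):
        next_higher = ranked[i - 1]
        # Clause (1): even with every transferable vote, this candidate
        # cannot equal or surpass the next-higher current total.
        if totals[cand] + transferable.get(cand, 0) < totals[next_higher]:
            impossible.add(cand)
            # Clause (2): anyone with a lower current total than a
            # clause-(1) candidate is eliminated as well.
            impossible.update(ranked[i + 1:])
            break
    return impossible

votes = {"Betty": 5000, "Al": 900, "Cy": 400}
spare = {"Al": 800, "Cy": 600}
print(sorted(mathematically_impossible(votes, spare)))  # ['Al', 'Cy']
```

The catch the post describes is that even when clause (2) makes the answer obvious, the counting procedure still required walking each lower candidate through the comparison by hand.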
And for those who care, the winner was finally declared a couple of hours ago.
"During routine website maintenance, a home page prototype was accidentally moved to the actual site. As with any mistake in testing, engineers noticed the error and quickly brought the site back to its normal functionfiction," Jeff Misenti, chief digital officer at Fox News, told The Register in a statement.
I will say, the teachers and social workers I know tend to be the happiest about their careers, if not their compensation. It says a lot about both the UK and the US that we laud those who sit in a cube, take orders, and do the 21st-century equivalent of assembly-line work (myself included), but gleefully take the piss out of those who dare to do something they enjoy for less money, less respect, and even less safety. For example, my wife, a teacher, has been physically and verbally assaulted doing her job, by both students and parents, while the worst I have to worry about in my cube farm is a pissy email from some colleague who thinks red, bolded font is threatening. I've also never been barricaded in my cube by coworkers who decided to stage a riot, or had people intentionally distract me so a partner could sneak into my cube and steal from my backpack.
Perhaps as a whole, we should really rethink what we venerate and why those who are unhappy about how little control they have over their work situation are somehow "better" than those who take risks.
Obviously, I can only go on anecdotal evidence. Yes, Lewis really took the piss with the published work, and if I cared enough, I might even read through the actual paper. But as Can't think of anything witty... said, psychology is not an easy subject, and it's no more a soft science than Computer Science is the same as IT. Accounting for differences I've seen between the US and the UK, psychology tends to pad the early courses with Freud and Skinner, because they are often used as classes for general education credits and it's easier than trying to learn about action potentials, anatomy, and neurochemicals.
My experience is that most psychology professors laugh, outwardly even, at the crap produced in the early to mid 20th century that is passed off as psychobabble in media today. It's only those who want to major in psychology that get introduced to the neuroscience, psychophysiology, chemistry, and the like. Once you are there, the first thing you learn is statistics... real statistics. And not just how to use Minitab, but the logic and rules behind the theory of statistics. You also learn the same research methods found in medicine and science, like lab procedures, ethics, etc. It's all there. And it's not easy.
And personally, I work in IT, or would if my current company hadn't worked hard to keep business and system analyst hybrids on the business payroll (I'm sure it makes the accounting easier). My math and analytical background from psychology has opened more doors than if I had signed up to learn programming languages in college (in retrospect, it would have opened more doors to have at least learned some along with my degree, even if I didn't want to go through those doors right away). Not to mention the stigma in interviews is not nearly as bad as what IT folks experience (I'm assumed to have people skills... ha!)
I also love the sour grapes from people who "worked hard" in college while assuming others did not. I knew CS majors and math majors who were just as likely to sleep through class, get drunk every night, and still stumble to the finish line and get a degree. Psychology had them as well, and if my brother's description of his engineering university is anything to go by, they probably lost more engineers to drowning in their own vomit than to academics. It's what college students do, and some can handle it, others cannot. To belittle an entire field of study because you don't understand it is rather ballsy, especially when there is nothing but your own bias and superiority complex to back it up.
Re: @Tom 13
Most of what you said was subjective, conjecture, gross exaggeration, or outright distortion. And then you delved into the world of paranoia and tin foil hats with the spying and the IRS, followed up with a healthy dose of "woe is me" martyrdom. Bush spied, Obama spied. The IRS went after crock-o-crap groups who filed like mad in 2010 to exploit a loophole, and those groups got mad they were caught being utter cocks. The fact that liberal groups were also checked at roughly the same rate (and rejected, something the right can't claim happened) doesn't register in your mind, because, you know, tin foil.
I'm not looking to engage you in discussion, mostly because it would be fruitless and filled with your own personal rants about particular grievances you, or the website you couched your talking points from, have against the government, people who work for the government, people who used the government, or people who might have six degrees of separation from government. Just wanted to make sure you and others knew how off-base and completely meritless your "responses" to another person's post were.
Re: They did add a pony
SNAP today is limited to "healthy" items. It's not just a cash benefit; there are very specific items that qualify. For example, many fruit juices don't qualify, because of the massive amounts of sugar they have for very little nutritional content.
Also, the reduction going into effect after Oct 1 is happening because the boost was a temporary increase as part of the 2009 stimulus bill. What Republicans want to do is remove a provision from a 1996 bill that gives states the leeway to waive the rule limiting able-bodied, unemployed, childless adults to 3 months of SNAP in any 36-month period. In times of economic duress, states are allowed to suspend the requirement that such folks get jobs or enter job-training programs if they want to continue in SNAP, and many states have suspended it because of the economy. Republicans want to do away with that, in addition to reducing the already-scheduled-to-be-reduced SNAP benefit (which is about $4/day/person for those with the lowest income... not exactly Oscar-style filet mignon... or even McDonald's-style "food").
Re: The Republican Dream
The "pox on both houses" sentiment is valid only if you want to take the most cynical view of politics. Not to say that you are of this ilk, but time and again, attitude and behavior research shows that the so-called "independents" and "both parties are the same" folk are the ones who talk big but don't tend to know a lot about the political process, party platforms, or even basic information like their Congressional Representative. To put it lightly, such attitudes are the province of folk who don't care but want to pretend they have a good reason not to care.
You could be an exception, but in that case I would question your knowledge of the core principles of each party, the various splinter groups within each party, and the regional differences that can explain a lot more about the propensity of a Representative or Senator to vote in a way that might seem counter-intuitive or "in the pocket" of lobbyists.
If you are trying to come up with another way to say, "I don't like either party because they don't represent my views," that's fine. But then you should at least have an idea of what other organizations out there represent you. As I tell others who complain about the current process: it's fine if you don't like it, it's fine if you feel left out. But if you want to be taken seriously, stop hand-waving the entire thing away and using it as an excuse to be apathetic and apolitical. If you truly care about these issues, you would expend some of that energy finding like-minded folks, something that is easier today than finding a lobbyist in Congress. Stop complaining and do something, or stop pretending you care.
I hope you are aware of the whole vsync thing: a 60Hz screen is limited to 60fps, while a 120Hz screen can hit 120fps, assuming the video card is beefy enough to drive it. And I believe those tests were first-person shooters, games that have other things to account for, like perceived and real latency between game action and user input.
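The refresh-rate numbers translate directly into per-frame time budgets; a quick back-of-the-envelope calculation makes the difference concrete (the rates shown are just the common ones, not anything from the original test):

```python
# Frame-time budget at common refresh rates, assuming vsync caps the
# frame rate at the display's refresh rate (1000 ms / refresh rate).

for hz in (60, 120):
    budget_ms = 1000 / hz
    print(f"{hz} Hz -> {budget_ms:.1f} ms per frame")
# 60 Hz -> 16.7 ms per frame
# 120 Hz -> 8.3 ms per frame
```

In other words, a 120Hz panel gives the GPU roughly half the time to render each frame, which is why the "beefy enough video card" caveat matters.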
Human vision is more attuned to movement than to static images. A flickering light is really just a static image, whereas a video will likely have movement and keep the eyes and brain on high alert. You should try staring at yourself in a mirror for a while with a single point of focus, and see what happens. Here's a hint: your brain gets bored and plays games with itself....
Re: "Modern", "streamlined", what?
My honest to goodness experience with web UI redesigns in various companies in the past is that they were unintended outcomes of backend changes. Be it because of a vendor change, an owner change, or some other change, the new hardware and/or software that served the business and warehoused the data turned out to be incompatible with the portal or other web front-end that was currently deployed. So invariably, it required a redesign, which was usually an excuse to troll through hundreds and thousands of comments, emails, IMs, tickets, and other dusty relics to determine what users wanted.
Usability studies, UX consultants, UI designers, and the like were typically left out until the very end, usually after the architecture work had been completed, meaning that the UI was a complete and total afterthought. So the DB would be set up with internal tools in mind, rather than serving the UI. And I wish I could say that these redesigns were for deployments that had minimal customer interface or were just an alternative to direct queries on the DB, but no. These were the public face of billion-dollar companies or web-based internal applications that served the foundation the company was built on.
Of course, there are always excuses to be made for botched implementation, but it will probably be some poor schlub or hastily press-ganged consultant who's forced to fall on their sword to protect the VP who dreamed up the pig's breakfast. If they are lucky, that act of protection will result in a new appointment elsewhere as part of the compensation....
I always thought...
That Yahoo! Groups were the place for cash-strapped high school students to troll for pron... years ago. I'm surprised as you to find out they both still exist and apparently have enough users to get upset about it. You'd think they'd have migrated to other free pron sites by now.
Re: I have to wonder...
Once again, you miss the point. At no point did I say, "Government, monitor away!" In fact, my original post and my follow-up detail that the problem is that someone like you sits here and whines about what the big bad government is doing, yet gives a free pass to non-government entities who are doing much the same. And if the PRISM revelations are anything to go by, that unchecked data collection and monitoring by corporations just gives governments a one-stop shop to pick up a dossier on anyone they please.
So one more time: Those who froth and foam at the mouth about big bad government collecting data but then turn a blind eye to commercial collection of the very same data completely miss the majority of risks when they go on about privacy and freedom.
Re: I have to wonder...
Really? You don't think people's lives can be ruined just as badly through identity theft? People don't get jobs because of bad credit scores. Is having no income or low income worse than the chance you might be hassled over an off-color joke about the government? Or perhaps through the leaking of intellectual property that causes monetary loss? Different laws in different lands, I suppose, but the reality is that the risk of finding yourself on a Gitmo holiday is less than being struck by lightning, and being there because of mistaken identity is much less. You are more likely to be shot to death by the police while trying to board a subway. Heck, you're more likely to be on the business end of a Predator strike because you had the unfortunate luck of living in a village or outpost that a target of value decided to hide in. That doesn't make it right, but that's the reality.
And do you forget the number of companies that The Register has had articles on who routinely use Facebook and other social networking sites to spy on prospective and current employees? And let's not forget the almost daily (even now) stories of companies who have had their websites compromised through a simple SQL injection, spear-phishing attack, or other security breach that allowed the perps to wander away with account information that could contain sensitive information or be used to procure additional sensitive information elsewhere.
The simple fact is that in risk analysis, people who are worried about the personal consequences of the government having information on them are missing the real risks. These are the same people who think planes are the most dangerous form of transportation and children are always snatched by strangers, yet think nothing of getting behind the wheel of a car every day or handing their kid off to a non-custodial parent or grandparent who they just threatened to cut off completely. You can worry about your all-expenses paid rendition holiday to the former Eastern Bloc; I'm going to keep monitoring my credit score and push to have companies disclose all breaches promptly and held liable for any personal damage that occurs. In 10 years, I'm sure I'll have more problems to deal with than you ever had.
And if you are doing anything to deserve such a trip, well... I guess that's proof the government spying works.
I have to wonder...
When people talk about the free exchange of information on the internet, did anyone stop to wonder who might be looking at that information? Here's a hint: It's not just the government, and they don't necessarily care if they come across something that is actionable intelligence that could save lives.
I realize that many people try to draw a bright line and say, "Government, you stay over here, while the rest of us will play over here." Besides the logistical impossibility of that, I think it's a rather dangerous game to play. I'm not even talking about the whole criminal enterprise aspect and government trying (vainly perhaps) to protect us, or the scammers who try to dodge and weave their way into a bank account or other ill-gotten gains. There is the fact that we have told the Government to stay out of our sandbox, while inviting our "friends" in who just happen to have resources equal to or greater than most governments to trawl through our tawdry details, all in the name of commerce.
We lie to ourselves by saying we can always do business with someone else, but does anyone actually believe that Facebook and others aren't aggregating enough information to find you elsewhere on the internet if they could profit from it? You can check all the boxes that say, "No, don't track me or sell my bank account to Nigerian princes," but it doesn't take many data points to at least predict your demographics, and a few more could narrow you down further to you or your terrorist twin in Algeria, the deciding factor being the result you click on when looking to add to your knife collection.
I don't condone the behavior of the American government, and at least I have a voice (ha!) as an American citizen. Perhaps even a bit more protection. And I don't subscribe to the "Nothing to fear if you have nothing to hide" bull, because it's the same claptrap that was pushed by Dick "Powered by Hate" Cheney. But I also don't think that my online persona, my personal communications, and all my real and virtual meanderings are free from use and abuse by other entities. While rendition is a terrible thing that no human should ever be subjected to, regardless of real or perceived intent, what about my credit score, my identity, my life being trashed by a bad algorithm, poor security, or corporate neglect or malfeasance? We should take the government to task for this, but that same harsh light and public interrogation should be turned onto the companies, entities, and others who have the same data and use the same mining and exploitation. If we fail to do that, whining about what the government is looking at is nothing more than mistaking a single tree for the entire forest.
Re: No SD slot?
Perhaps I'm missing something then. I often swap movies and music in and out of my phone using a USB cable and disk mode. And I have a removable micro SD card. It's just that by the time I remove the back of my phone, take it out, plug it into the USB port via its included USB cradle, and start the transfer process, I could have just plugged my entire phone in with a USB cable, selected disk mode, and started the transfer. And some phones with removable SD cards are hiding them in a way that requires tools and other procedures just to get it out of the phone.
While I get that a lot of the internal memory can be used by the OS and bloatware (7.5 of the 16GB on the S4, with the added pleasure that they didn't allow you to install apps on an SD card you inserted), it does seem like much ado about nothing. I get that if you are on a longer trip or stuck on a transcontinental flight, it might be nice to have more than a couple of movies with you. But we are still talking about a feature that isn't high on the list for most smartphone buyers, who rightly or wrongly have been wooed by light and thin. Perhaps it will be in the future, but with WiFi everywhere, pretty decent LTE coverage (at least stateside), and other things, on-phone storage of media might become rather antiquated. Data caps might strangle that idea before it can get out of the cradle, though.
Re: No SD slot?
That's a use case that probably has many other requirements beyond a removable SD card.
I for one am glad to contribute to the downfall of man...
Err... I mean the advertising useless crap that I don't care about. If I'm at Newegg and they want to target me with ads about something that's related to other stuff I've browsed there, thank you. Same with any other site. It's bad enough to see the crapvertising that's based on my IP address, telling me that Obama commands me to get new car insurance, or I should contribute to some turd's reelection campaign in the armpit of my state. If I want to block third-party cookies, that's my choice.
Now if only there was a way to modify those third-party cookies into something malformed that made their database drop a huge load all over the floor....
While I still use my old-fashioned film SLR for photography once in a while, it, much like its younger DSLR siblings, is bulky and requires more than just pants pockets to keep on you. What do you consider an "actual camera"?
The compact digital cameras that mimic the old point-and-shoots aren't exactly head and shoulders above the Lumia 1020 on paper, and for good reason: for day-to-day photography (not just selfies in the bathroom mirror), smartphones have been destroying those $250 compacts. You (almost) always have your phone, so that quick pic at the bar or out on the lake (I'm a fisherman) is going to be handled by it. No need for a dedicated camera that is only slightly less limited than your phone (or more if being able to upload immediately is a major want/need).
If you want to take artsy selfies, portraits suitable for hanging, landscapes, nature photography, action shots, or low-light shots, the DSLR is still the best bet, as it has swappable lenses, larger CMOS, and much better optics. But as I said above, the problem is the size and bulk that prevents everyday use, plus the much greater cost. At the same time, I know that some of the camera companies are trying out compact DSLR-type cameras, in hopes of carving out a niche between smartphone and prosumer photography. This could make such a job harder... or promote a very nice middle-of-the-road option for people to use in a point-and-shoot-sized body.
Re: Someone (or people) much smarter than I
That's my point: The incumbents have the option to either combine the two use cases into a single OS or create two parallel OSes that share a design language. Both options have merit and problems.
And again, the actual sales numbers point to declining desktop and laptop usage, not just projections made for 2019 by Gartner and others looking to provide consulting services to flailing companies. In fact, last night the Q2'2013 numbers were just published by both Gartner and IDC, and it looks like worldwide sales dipped another 10% from the same period last year.
It's a niche that matured 5 years ago. A software company can't spend billions of dollars on maintenance and incremental upgrades to a platform that stopped being a growth category. For years people took Microsoft to task for keeping so much legacy code and operations that stretched back to Windows 3.1. It was repeatedly claimed that only if Microsoft rebuilt from the ground up could they create a pristine OS that would rock our world. Well, they took a bite at that cherry with Vista and botched the implementation. Win 8 came around and tried again, but they were upfront about the design language, and people spent the next year whining about how things had changed and how terrible it was that their modern OS specifically designed for the new wave of tech (touchscreens, specifically) didn't work how they wanted. So MS capitulated and came out with 8.1. Hopefully for their shareholders, a lot of money wasn't spent to give the start button back.
Someone (or people) much smarter than I
Was referenced above when it was pointed out that the desktop is a dying breed. Disregard the projections all you want, the numbers from the last few quarters point to the slow bleed of the desktop world. Due to laptops, tablets, netbooks, smartphones, or the back of a shovel with a rock, the desktop that I know and love is fast becoming the province of code monkeys and those who can't play FPS games without WADS and a mouse.
Even laptops are being displaced in some workplaces by tablet and smartphone combos (though you'd have to pry my EliteBook out of my cold, dead hands). Sadly, that means a modern OS is going to have to either account for that (Win8 and the Ubuntu attempts) or divide and (attempt to) conquer (iOS and OSX). The former strategy promotes bloat and waste while upsetting the apple cart just to frustrate and irritate the current users, while the latter provides a consistent experience that tricks users into thinking interoperability should be flawless.
Transition points are often no-win situations for the incumbents. MS made their bones when the world moved from command-line to mouse-driven GUIs. Yes, they grifted and stole from existing products, lied to collaborators about their intentions, and generally begged, borrowed, and stole their way to the top, but if it wasn't them, it would have been someone else. Apple got out to an early lead with iOS, but they've stagnated and rested on laurels, much like they did in the 80s. Android is just a way for Google to make money on ads and will lose its support if the shareholders ever find a way to force a spinoff or an end to the various pet projects that happen. It might be for short-term gain and kill the company long-term, but that is what today's shareholders do best.
Perhaps some wear-leveling algorithms/circuitry combined with spare capacity, just like the solid-state drives today? I'm not an electrical engineer by any stretch of the imagination, so I'm sure there's some reason why that's an insurmountable issue or some such.
The practice of isolating specific isomers is very common among pharmaceutical companies as a way to "extend" patents. Take a look at the Omeprazole to Esomeprazole "conversion" that was done, along with the marketing material "proving" that the Esomeprazole version was more effective, when the evidence shows that the acidic environment of the stomach converts either version of the chemical into the bioactive version.
That's not to say handedness isn't important: there is a predilection towards left and right handedness depending on the class of biological chemicals you are looking at. However, the isolation and synthesis of the desired version isn't exactly the stuff future Nobel Prizes will be built upon.
Bad for traders, but still okay for investors
I think a central misunderstanding by arm-chair market participants is how bonds work. Yes, just like stock, they are a commodity that can be traded. The price is supposed to represent a value above or below the face value of the bond itself, based on expected cash flow over the remaining bond term, other investment vehicles that could be used in place of the bond, etc. So if you are a bond trader, as in you rely on buying bonds at a discount and selling them at a lesser discount or even a premium, then yes, a large drop in the market value of the bonds will tank your portfolio.
But if you are a true investor, as in, you are looking for guaranteed fixed income over a number of years, this is probably the best time to buy the bonds. They are so discounted that a buy and hold strategy would net you some pretty profits, assuming you want to lock the money up. Since the 30 year bonds have a 3.9% rate and you can buy them for 85 cents on the dollar, buy and hold would get you a 17% return at maturity, plus the 29+ years of interest payments. Is it the best use of money? Probably not if you have a burning desire to grow your portfolio more quickly, but if you're 65, 70 years old and want to lock in some kind of return for at least the next 10 years (30 years is a long time for a tech company, but that cash hoard should at least take 10 years to burn through), there are probably worse strategies out there. And if you are buy and hold, there are a number of 3 to 5 year maturity bonds you can probably buy off of panicked traders, though the yield is less than the current dividend yield of the AAPL stock.
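The arithmetic above is easy to check. Using the figures from the comment (3.9% coupon, bonds trading at 85 cents on the dollar), the hold-to-maturity capital gain and the current yield work out like this:

```python
face = 100.0          # face value, repaid at maturity
price = 85.0          # purchase price: 85 cents on the dollar
coupon_rate = 0.039   # 3.9% annual coupon, paid on face value

# Capital gain if held to maturity (coupons counted separately).
gain_at_maturity = (face - price) / price
print(f"return at maturity: {gain_at_maturity:.1%}")  # ~17.6%

# Current yield: annual coupon income relative to what you paid.
current_yield = (coupon_rate * face) / price
print(f"current yield: {current_yield:.2%}")          # ~4.59%
```

So the "17% return at maturity" claim checks out (15/85 is about 17.6%), and buying at a discount also bumps the effective coupon yield above the stated 3.9%.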
And there isn't the added heartburn of "will they, won't they" every time a debt ceiling debate comes up in the US Government.
It's all about heat transfer
As discussed in many stateside papers and pop-science articles, the thought is that higher-than-global-average warming at the poles is creating a more "uniform" distribution of heat across the globe. Since the air currents (upper and lower level) are mostly a function of such heat differences, it's easy to theorize that there would be a reduction in the speed and strength of those air currents. As those winds (especially the upper-level ones) are integral to the movement of weather systems across the globe, any slackening of those winds would easily cause "stuck" patterns and allow the tropical and arctic air masses more time to intrude upon the temperate latitudes. The jet stream buckles more, and only a kick in the teeth from a larger-scale phenomenon, such as changes to the tilt of the planetary axis, provides the necessary energy to move the jet stream around.
This is the "extreme" weather phenomena that is predicted to occur more, as those buckles are more amplified and move slowly across the globe. So one season might be extreme heat, the next might be extreme cold.
It's all fun and games until you have to run...
His disappearance is probably due to Hong Kong's warning of extradition. Why he chose HK over a number of other areas is beyond me. Actually, why he did this at all is beyond me, if only because the big reveal (the government is spying on electronic communications!) is yesterday's news. It doesn't take a rocket scientist to parse together the immunity clause for telecom companies along with the authorization for warrantless (but judicially-approved) gathering of call and email logs contained within the Patriot Act and come up with the basic outline of the PRISM program.
Examination of the contents of specific emails and calls, if you are a US citizen, typically requires a warrant (though that's not much of a barrier with the "special" court). The sheer amount of data means that actually logging the contents of every communique would result in a huge noise-to-signal ratio, so instead they log and triangulate the sender, destination, duration, frequency, etc. and look for patterns. Obviously we don't know the utility of such actions, but the likes of Google, Microsoft, Yahoo!, and others have been telling the market that such activities are huge moneymakers.
While I abhor the activities, until the Patriot Act is revised or trashed such things are legal. The actual legitimacy under the Constitution is both hard to determine (mainly, is this really unconstitutional search and seizure) and easy to dismiss (national security concern, no real mechanism within the Judicial branch to review such behavior or evaluate the national security claim).
So the solution....
Must be to regulate the ammunition instead of the delivery vehicle. Anyone with a bit of ingenuity can make a dangerous (to the user) vehicle to deliver a bullet in a specified direction. Heck, they might even hit the broadside of a barn once or twice (without suffering injury). But without ammunition (which can be hand-packed today, I know), these plastic, pipe, prison, etc. guns are useless. So just make the ammunition hard to obtain. If that means people go back to black-powder muskets in an attempt to circumvent the rules, so be it. I'll take my chances staring down a barrel that isn't rifled, shooting a lead ball that couldn't be described as spherical.
Re: Not playing their game
HBO in general waits nearly a year to put the season out on DVD. Even here in the states.
Re: @Eric Olson
I'm stateside, so our system is a bit different. For one, there were no qualifying assessments made before graduation. It was either successful completion of the classes required or not. Arguments about the merits and issues with the two educational systems aside, anecdotal evidence suggests that my education in data analysis exceeds the education my younger brother received when he obtained his computer and electrical engineering degrees. However, his knowledge of IEEE standards and the National Electric Code far outpace my own. As does his salary.
Not sure how psychology is taught in the UK...
But at least where I was 10 years ago, psychology was a statistics-heavy degree. It put you miles ahead of most other graduates outside of the math department. Maybe it's the kind of attitude on display in this article that pushed psychology to fiercely legitimize itself through math, but statistical analysis and quantitative analysis are huge parts of the degree in many schools today, which has allowed me a career in IT and (currently) software business analysis (with no degree-based IT experience or training).
More broadly, I think this is again a problem with the idea that people who are in school now should be literally placing bets on the usefulness of a specific degree 10, 15, 30 years from now. In 2000, the big degrees were in finance and pre-law, both of which now are suffering from massive over-supply issues. Right now it is Comp Sci and Engineering, two very specific skill-sets that are already showing signs of over-supply, at least at the entry level. While "doing what feels good" might not be the best strategy, it seems those of my cohort who came out of college with no specific job or field in mind are doing the best and are the most satisfied with our jobs... excluding the teachers.
But that's my two cents.
Stay with me here...
Storms move along steep pressure gradients, typically caused by thermal differences. High pressure moves into areas of low pressure. A low-pressure system won't move west, east, north, or south without something pulling it along. If an increase in air temp throughout a given column of air above the polar regions reduces such gradients, blocking highs will become the norm. Such blocking highs would cause continued buckling of the atmospheric flows, resulting in energy being shunted off in directions normally not seen.
Taken together, this is why the whole idea of "extreme" instead of "warm" weather is predicted by most climate models. With more energy in the system, it takes an even greater push to move highs along their merry way. This allows diversions of upper-air winds to carry tropical or polar air further into the temperate regions. Tropical air tends to carry moisture; polar air tends to be dry. In general, that means a pattern is formed and harder to break. That's not to say such things did not occur in the past. But should we just accept that adding more energy to the system in a way that creates more equilibrium than before (polar areas are warming much more than tropical areas) is a good thing? It just creates a more sluggish climate that responds only to gross changes in energy, such as the change from summer to winter, rather than smaller energy fluctuations. The climate works on a system of equalization, tropical to polar. Reducing that difference reduces variability and cements patterns. That's not a good thing.
Re: Scientific Terminology
While the media do certainly get a bit excitable when things like this happen, the facts are that the storm lost its tropical characteristics before landfall, which was later confirmed in post-event analysis. The National Weather Service, in fact, has been called to task for not only keeping its Hurricane and Tropical Storm Warnings active when the storm was in fact no longer tropical in nature, but also for confusing the subject by trying to explain in the text of those warnings that it wasn't in fact a tropical storm. According to the NWS, they didn't want to mislead the public into thinking that the phenomena associated with tropical storms (storm surge, tornadoes, squall lines, hurricane-force winds, etc.) were no longer present just because the storm had transitioned from tropical to extra-tropical.
It doesn't help that many homeowners' policies for the coastal United States have extremely specific clauses regarding hurricanes, tropical storms, nor'easters, and other storms that come off the ocean as opposed to from the land, which makes classification even more important from a legal standpoint. Usually, if it's classified as a hurricane, insurance moves from a replacement policy to a shared or large-deductible (I think 10 to 15% of the insured value of the home) policy. That's quite different from a $500 deductible when your home is worth $150,000.
I know this is about old games...
But did anyone else get a chance to play with the new SimCity a few weeks back? It seemed to have a similar amount of entertainment to what I remember from the original, without getting deep in the weeds like SimCity 3000 ended up doing.
Well, now we need something else to complain about...
It doesn't have a 10-key? Is Dell intentionally selling a crippled laptop in order to make Windoze machines look better?! </sarcasm>