Re: Best OS Ever
In other words, Windows 2000 was the best version of Windows ever. I would certainly agree it was one of the better ones, essentially XP but with an interface not designed by a five-year-old.
If I were the judge, I would find...
It's probably a good thing you're not the judge then. Call me old-fashioned but I'd expect the judge deciding any case I was involved in to listen to the case. Allowing the case to proceed and not announcing your verdict before the case has been heard is a necessary prerequisite.
There are way too many here who fall into the same trap. Reading a few brief articles in the media does not make you an expert in the matter. If you happen to be legally qualified and have considered all the evidence available then maybe you have an opinion that is worth listening to. Otherwise it is an ill-considered and ignorant knee-jerk reaction.
As it turned out, the future of 64bit computing wasn't Intel's IA64 (Itanium), it was AMD's AMD64 *product* (which was not slideware like IA64 was at the time)
Intel released Merced in June 2001. The first AMD64 chips were released in April 2003. You make some good points but have chosen to base your whole argument on a factual inaccuracy.
He's also never been tried and convicted so the judge is out of order treating him as if he has been tried and convicted.
The judge made a finding of fact that the machines contain data that doesn't belong to Love. She is then perfectly correct to use that as a basis for the conduct of the rest of the case. The courts would be paralysed with a never-ending merry-go-round of hypothetical arguments if that wasn't the case. The court explores an issue, the judge makes a determination and then the case proceeds on that basis.
Amazon are the worst offender for that: about twelve months ago they changed the unlock process on a Kindle Fire. I forget the details but you had to swipe in a different direction to previously. No explanation of the change, the adverts on screen still there, not even a "this has changed and you'll need to do this now..." on there. Only took a few seconds to figure out but leaves a very nasty taste in the mouth.
I'm in two minds. One thing I do notice when switching between Libre/OpenOffice and MS Office is that the free alternatives are painfully slow in comparison. I hate to say it but it is one thing MS have got right.
On the other hand MS are doing their level best to make using Office as slow and cumbersome as possible with stupid UI decisions, whether it be the Ribbon and the constant tab switching that involves or the "Shove more service in your face" decisions such as presenting cloud storage almost as the default save location and making navigating local directory hierarchies a ballache.
I agree. The absence of any kind of real evidence (no, anonymous claims with nothing to back them up are not evidence) is what makes me dubious of these claims. If this had been noted and properly reported by someone reputable it would be headline news and Huawei would be dead in the water in foreign markets at least: put simply they can't afford to insert backdoors, it risks the entire company. Look at how Supermicro were affected over unsubstantiated and incredible claims over a small number of motherboards being hacked, and then consider what would happen to any company installing that sort of thing deliberately and as standard if it became public knowledge.
So yes, when you have something real to report from your reverse engineers let us all see it, until then this is nonsense. I somehow doubt we'll hear back from you. Although I must admit I'm really impressed by your technical prowess. Getting Wireshark to tell you not what is on your LAN, but what is going out the 3G side of a 3G modem. That's quite some feat...
I thought that as well, didn't even need to look it up to see something was dramatically amiss. I recall reading there's actually very little matter in the rings considering how prominent they are - if you bundled it all up it would form a moon "only" 100 miles across.
As for the substance of the report I'll continue to have that down as "no one knows" simply because I've seen too many conflicting reports in the past. The prevailing wisdom tends to be they are unstable; perhaps ten years ago the view was they are less than a million years old. Then someone did a simulation and worked out they could indeed be stable and be up to 4 billion years old. Now a new simulation (admittedly with new data) is saying the opposite.
Of course, I'm not saying they're wrong, just that I'm not convinced.
Who said anything about specifically government funding? Since I've already mentioned electronics, a lot of that evolved from the likes of Bell Labs in the 30s, 40s and 50s, i.e. the private sector. However there are comparatively few companies that engage in that kind of fundamental research and they need deep pockets to fund things with no clear payoff. If you left it all to those billion dollar companies all you do is concentrate ever more power in those few select organisations.
I appreciate that CERN furthers knowledge in theoretical physics, but I'm hard pushed to see what benefits there are to knowing the mass of a neutrino.
That's the whole point: we don't know what we'll find, which is often true of fundamental research. However, if the same attitude had been adopted 100 years ago the whole of nuclear physics and modern electronics would have completely passed us by. In the 1850s no-one was saying "wouldn't it be great if we had transistors and microchips", but that is just one area of huge practical consequence that fundamental research delivered.
Is it really more expensive to manufacture SSD than it is to manufacture spinning rust with all its separate mechanical components and precision assembly needs, or are SSDs presently priced higher because they can or because market demand still exceeds production?
On a per-unit basis, probably not; the problem is the huge sunk costs in semiconductor fabrication that need to be recovered over a relatively short lifespan. Sure, a factory making spinning rust drives is working to tight tolerances, but those are largely a given in high quality mass production environments - one of the favourite examples is that a Swatch watch is made to finer tolerances than a Rolex, not because that accuracy is needed, it is simply inherent in the process. Then after a few years, retooling a hard drive factory for the next model is perfectly realistic - i.e. the one from five years ago making 1TB drives can be converted to make next year's 12TB units.
On the other hand semiconductor fabs seem to run to a couple of billion a piece these days before you even start making chips, and that sunk cost has to be recovered over a very definite lifespan: if you want to upgrade that 22nm fab from five years ago to 10nm for next year you are essentially starting over in terms of plant. The same goes for 3D and similar semiconductor advances.
But you are on shared spectrum - you and everyone else in the locality on the same network are using the same bandwidth. If you want more bandwidth you can pay for it: if enough people do the same there is the economic incentive to install more masts, make the cells smaller and thus create more capacity.
But you don't want to do that: if you did you would have done so. Indeed, you explicitly state the motivation is to drop your DSL, presumably to save money.
If you're using what is effectively a community resource why should you be entitled to more than your fair share if you are not willing to pay for improving things to compensate for your additional use?
Mobile spectrum is always going to be finite and at a premium because of that. "Me wants and screw everyone else" does not alter that.
The metric system is a tool of the devil! My car gets 32 rods to the hogshead and that's the way I likes it!
But the inch, yard etc are defined in terms of their metric equivalents. In a very real sense that makes them metric units, in that if the definition of the metre changes so does the inch.
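That definitional link is concrete: since the 1959 international yard and pound agreement, the inch has been exactly 25.4 mm, so imperial lengths are derived quantities. A trivial sketch:

```python
# The inch is defined as exactly 25.4 mm, so every imperial length
# unit is ultimately derived from the metre.
MM_PER_INCH = 25.4

def inches_to_mm(inches):
    return inches * MM_PER_INCH

print(inches_to_mm(1))    # 25.4
print(inches_to_mm(36))   # one yard: exactly 914.4 mm by definition
```

If the metre were redefined tomorrow, these conversion factors would stay fixed and the inch would silently move with it.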
"If the address is valid" is no more secure than "if the phone number is valid"
If the address is valid you are assured that it is the correct recipient. As I indicated they may have moved to another organisation but it is still the same person and they are still within the NHS. With a fax number you have no such guarantees.
Fax is "secure", but maintaining it requires a phone line.
No it's not, there is more to security than interception or man in the middle attacks. The big problem with fax is that there is no method of asserting the recipient is the intended one - the details could be out of date or the number misdialled - you don't know until the notes have gone somewhere they shouldn't have. And yes this is a very real concern, I know from experience GP practices won't keep contact details up to date even with the organisations that pay them. What hope for general directories?
With NHSMail things are not ideal because the address is assigned to the individual rather than the organisation (excepting group mailboxes) but if the address is valid they are still within an NHS care provider somewhere and presumably bound by professional standards of confidentiality, even if they no longer work for the intended organisation.
Memory is today one of the slower parts of computing. Whenever your CPU actually has to access it, it takes a long time. Caching solves a bit of the problem, but it quickly gets very difficult.
Memory isn't inherently slow, certainly it can be physically made far faster than current modules. The problem is the interconnect: the physical distance adds latency and limits on how quickly you can modulate even PCB traces limit the bandwidth. I suspect in the long term massively parallel systems with relatively small on chip stores will be the answer, but making that work needs a fundamentally new programming paradigm and new algorithms.
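As a rough illustration of the distance point, assuming signals propagate at about half the speed of light on typical PCB material (a common rule of thumb, not a measured figure), trace length alone costs a meaningful number of core clock cycles:

```python
# Back-of-envelope: latency contributed by trace length alone, assuming
# signals propagate at roughly half the speed of light on typical PCB.
C = 3.0e8            # speed of light in vacuum, m/s
PROPAGATION = 0.5    # assumed fraction of c on FR4-style board material

def trace_delay_ns(length_m):
    return length_m / (C * PROPAGATION) * 1e9

delay = trace_delay_ns(0.10)   # a hypothetical 10 cm run to a DIMM slot
cycles = delay * 4.0           # cycles elapsed at a 4 GHz core clock
print(round(delay, 2), round(cycles, 1))  # ~0.67 ns, i.e. ~2.7 cycles each way
```

That is before any controller, buffering or DRAM access time is counted, which is why shrinking the physical distance (on-chip stores, stacked memory) is so attractive.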
To be fair Sennheiser make some genuinely decent kit, it isn't bling and flash for the sake of it. Look at their product range and the kind of accessories you cite are conspicuous by their absence, they're not the same as e.g. Beats, charging a premium for stuff that is at best mid-range.
Yes, a lot of the premium priced brands are simply marketing with nothing of substance to back it up, but genuine high-end audio gear commands a fair price too. You need to distinguish between the two.
I've been on both ends of this, both having to write procedural guidance and follow rarely used stuff for e.g. end of year processes. I can sympathise with the authors, it's very difficult to write technical documentation that essentially boils down to a sequence of steps in a manner that does not sound robotic. Your initial draft often ends up as "select this", "click this", "click this", "click this"... and the resulting prose sounds awful even if broken up by bullet points or numbered steps. It's also quite difficult to follow since all the steps seem so much alike and can be confused - at this dialog do you follow the "click OK" step or the "click cancel" step?
You do try and mix up language and sentence structure as much as possible but there are rules that you are best rigidly following, such as here: always presenting steps and information in the order they are needed. Out-of-order presentation, even within the same sentence ("Do this after having done that..."), is guaranteed to cause errors because people will have done "this" before reading ahead the few extra words which cover "that".
It's never going to happen in any case, it is nonsense from the Brexiteers. No, nothing has been officially announced, finding a couple of random eurocrats vaguely musing the possibility does not amount to official policy statements.
On the other hand we had already negotiated ourselves out of ever closer union. Also consider the nations remaining in the EU. On the one hand you have a nuclear power and multiple NATO members. On the other hand you have the likes of Ireland with a constitutional commitment to neutrality. You can't negotiate away such divergent attitudes and the proposal inevitably fails there.
I'm seeing much the same. For the last couple of years it's been really hyped but come the day it's inevitably turned out to be 5-10% discounts on 5-10% of products. Doesn't really excite me. In fact the only Black Friday offer I can remember ever taking advantage of was a 25% discount on a magazine gift subscription for a friend last year. For the last couple of months I keep getting reminders to "renew" that but I figured they'd be offering the same discount again tomorrow.
Went on the relevant site last Sunday, they weren't offering the Black Friday discounts just then (unlike so many other sites) but were saying what the discounts would be and when (25% again, Friday-Monday). That makes no sense to me at all, it is effectively saying to your customers "Don't buy this now, wait a week and save yourself some money, if you think to come back at all".
For stuff I am buying locally I find the real discounts are achieved through simple procrastination, it's amazing how the prices drop after Dec 20...
New IT initiatives will fail as they are not addressing a fundamental problem in health care; it is incredibly personnel-intensive, with much of the personnel being highly trained and expensive on the payroll.
I don't disagree with you but you need to look a little deeper than the survey. I note it refers to digital access services, i.e. customer facing services. I'm not surprised. I'd expect greater support for back office services if they cut costs and/or improve the service. You wouldn't believe how much resistance you get within the NHS to any kind of change.
For example, I've encountered practices who will flat out refuse to accept patient notes via email (typically under the premise "it's not secure, can't you fax it?") despite the fact NHS regulations (and the professional bodies) confirm that approved secure email is in fact the preferred method of transferring notes in an ad hoc manner (fax is only a last resort because of its security concerns). Oh, and they're contractually required to accept notes by email.
Many NHS IT projects, especially ones that cross organisations, face a difficult deployment because of those attitudes rather than any technical factors.
Thankfully we have here The Register's army of commentards, who are sure to remain universally calm and rational!
One would have hoped there were enough clues in the article but not for the first time something like this has clearly gone "whoosh!" straight over the heads of many commentards.
Seriously, a commodity USB wifi/Bluetooth combo is a "pretty powerful IoT device", and obviously a program called "logger" is automatically suspicious on a Unix system. You expect that on Reddit but you'd expect at least enough nous to recognise satire here.
If the entity making a claim can not or will not produce proof of their claim, then the other party can legally sue for libel. If the paper refuses to produce one of the supposedly compromised boards, SuperMicro can take them to court & sue the shit out of them.
They can try. However if it goes to court the onus is on the party making the claim to provide the evidence - in libel cases this is inverted to how you might expect, i.e. the potentially libelled party has to prove they were libelled rather than the libelling party proving what they have said is true, since the case is about the libel rather than the original allegation.
That makes the case difficult to prove absent evidence of conspiracy or some other skullduggery since you are trying to prove a negative - the inability to produce boards that do not exist is not itself evidence that the supposed boards do not exist.
The first Pis did use a linear regulator, but recent revisions are all switching.
That's a goer for 5V->3.3V since you are scrubbing 1.7V, or 1.7W at 1A. Going down from 12V it's then 8.7W dissipated by a linear regulator, which is getting into the region where you need to add a reasonable heatsink.
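The arithmetic behind those figures is just (Vin − Vout) × I; a quick sketch (the 1 A load current is the assumption here):

```python
def linear_reg_dissipation(v_in, v_out, i_load):
    """Power burned as heat in a linear regulator: (Vin - Vout) * I."""
    return (v_in - v_out) * i_load

p1 = linear_reg_dissipation(5.0, 3.3, 1.0)   # about 1.7 W: fine with modest cooling
p2 = linear_reg_dissipation(12.0, 3.3, 1.0)  # about 8.7 W: needs a real heatsink
print(round(p1, 1), round(p2, 1))  # prints: 1.7 8.7
```

Note the dissipation depends only on the voltage dropped, not the output power delivered, which is why linear regulation gets ugly fast as the input-output gap widens.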
When it came to the Cyrix 486DLC vs SLC, though, there was a noticeable difference. Both were essentially 386 chips with the additional 486 instructions bolted on but with no performance enhancements. The SLC was crippled further by using a 16 bit external bus, just like the 386SX. People thought they were getting a 486-alike when the reality was more similar to that 386SX.
Us low-power/embedded people are annoyed at the 5V too - but in the other direction.
Can't say that I've ever encountered that attitude. Indeed it would make interfacing difficult for the average use case - how can you plug in USB peripherals without a 5V supply for them? I admit having 5V power but I/Os that are not even 5V tolerant isn't the most comfortable design decision though.
I'm already somewhat annoyed by the use of 5v on a micro USB port, given the limitations of the port itself for carrying current, which thanks to Ohm's law, has to be greater than if a higher voltage had been used, and makes the system more vulnerable to reduced power thanks to the high resistance leads and connectors that are to be found too often on the typical cheap Chinese power supplies people will be using.
That isn't actually Ohm's law, but simply an inevitable consequence of the way the units are defined, i.e. J/C (volts) times C/s (amps) inevitably cancels to J/s (watts), no scientific law, it's just maths.
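A quick sketch of the maths in question: the identity P = V × I sets the current for a given load, and it is then Ohm's law proper (V = I × R) that turns that current into voltage lost in the lead. The 10 W load and 0.3 Ω lead resistance below are purely illustrative figures:

```python
def supply_current(power_w, voltage_v):
    """Current drawn for a given power at a given supply voltage (I = P/V)."""
    return power_w / voltage_v

def cable_drop(current_a, resistance_ohm):
    """Voltage lost in the lead - this part IS Ohm's law (V = I*R)."""
    return current_a * resistance_ohm

# Hypothetical 10 W load fed through a cheap 0.3 ohm lead:
i5 = supply_current(10, 5)     # 2 A at 5 V
i12 = supply_current(10, 12)   # ~0.83 A at 12 V
print(cable_drop(i5, 0.3))     # 0.6 V lost - a big chunk of a 5 V budget
print(cable_drop(i12, 0.3))    # 0.25 V lost - far less painful at 12 V
```

The higher current at 5 V is pure unit arithmetic; Ohm's law only enters when you ask what that current does in a resistive cable.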
I do understand the concern about the power connection but a barrel connector wouldn't be my first choice, I'd prefer the option of something with a positive lock that can't simply be tugged out. Powering using 5V attached to the GPIO port may be an option; how practical that is depends on your application. As for the supply voltage, I suspect that comes down simply to the cost objectives. Dropping the power down from e.g. 12V means either a switching (cost) or a linear (heat!) regulator, and neither allows the use of the standardised commodity power supply the user may well already have lying around.
This is part of the problem when it comes to deploying FPGAs - the superficial similarity to software leads the naive to write code that is logically structured in a manner equivalent to conventional computer programs. The problem is that what is a natural fit for software - essentially a sequence of steps - doesn't fit very well when expressed as hardware, and you inevitably lose the parallelism advantages that make FPGAs attractive in the first place.
The best programmers making the switch tend to be those with good backgrounds in functional programming, they are much more used to the mindset of "This is a relationship that always holds" rather than "do this, then do this, now do that..." Unfortunately FP isn't particularly fashionable and the talent pool is limited.
For local HA, with redundant equipment, when a disk, switch or server goes down automatic is fine.
This strikes me as being a LAN issue though, at one site or the other. A fault that caused connectivity to the WAN link, rather than on the WAN link itself, to be lost. The reason I suggest this is that in default trim a 43 second outage is suspiciously close to the time traditional STP will take to kick in and rearrange the active links.
These days you would expect people to be using at least RSTP, which is almost infinitely faster to respond, but only for some of the more common faults. In particular it doesn't detect the case where a link fails in one direction in a consistent manner; in that case you are still dependent upon the original protocol to sort it out, with the delays that come with it.
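For reference, a 43-second outage sits between the two classic 802.1D convergence figures, which can be sketched straight from the default timers:

```python
# Classic 802.1D STP default timers, in seconds.
FORWARD_DELAY = 15   # the listening and learning states each last this long
MAX_AGE = 20         # how long a stale root BPDU is retained before acting

# Link failure seen directly (link-down event on the switch port):
direct_failure = 2 * FORWARD_DELAY              # 30 s
# Failure only inferred via max-age expiry (e.g. one-way link fault):
indirect_failure = MAX_AGE + 2 * FORWARD_DELAY  # 50 s

print(direct_failure, indirect_failure)  # prints: 30 50
```

A 43-second gap falling between those two bounds is consistent with (though of course not proof of) legacy spanning tree doing the failover.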
Now, yes this may take a while to come down in price but saying they need to rebuild the networks is false. They first need to actually build a decent network which they continue to drag their feet with. And as for "paying for the investment" you see all them dividends and profits?
The endpoint devices still need to be paid for. Even fibre needs replacing at times - it isn't a single one-size-fits-all quantity - and as you note a lot of the network is still copper after a lot of investment already undertaken. As for where the money comes from - take a look at BT's annual report for this year: Openreach made an operating profit of £1.2bn compared against capital expenditures of £1.6bn. Again, if you want to multiply the amount invested by ten or so, where does the money come from? I take it you are volunteering to pay since it is such a non-issue? Cheers for that, we'll all appreciate it.
Like it or not the phone network was built, evolved and paid for over decades. If you expect the entire network can be ripped up and replaced every couple of years you are simply divorced from reality.
So you as a customer would rather pay 10x what you do now so they can completely rebuild their networks every 18 months? That delivers minor incremental benefits you can't even feel?
Of course things like this take time to get rolled out, starting with the pinch points on the current network where the greatest benefit is felt soonest.
By all means argue greater investment is needed but try to keep your argument based remotely in reality: you are not willing to pay for the level of investment you demand. If you're not willing to pay for your faster broadband who do you propose should, especially when that 'investment' is being done in such a foolhardy manner?
In terms of the cable thickness, the Supercharger lines are actually not much thicker than the hose of a petrol pump.
Thickness is one thing, stiffness is quite another. My father used to work on diesel generator sets and they would deal with heavyweight cables on a regular basis. The actual engineers would always insist unreeling cables was a two man job once it got into the 150-200A territory.
Wasn't very popular with management. Especially if it was a hire set for a weekend, as if that makes a difference to the properties of the required cable.
What reports like this do not quantify is what is regarded as sexual harassment. If it is left entirely up to the victim this may give results that an independent observer would consider misleading, even if the reports are themselves honest in the eyes of the complainant.
I would think that most people would consider that a bloke asking a woman out for a date is fair game. Sometimes this could be slightly awkward but that is the nature of the enquiry; I wouldn't consider that to be harassment unless it is repeated after a clear "no" signal has been delivered. On the other hand if the woman is asked twenty times over by twenty different men that could easily be construed as harassment collectively even if each bloke is simply trying his chances once. I have seen awkward invites out in the workplace, and while you may be left thinking "That could have been smoother" you are reluctant to attach blame to the admirer.
On the other hand perfectly innocuous behaviour can be misconstrued. I recall perhaps six months ago a girl at work came up to me in tears with a bit of a crisis of confidence over what she was doing: I instinctively reached to put my hand on her shoulder. She flinched before I made contact, so I withdrew the hand and apologised, which she immediately accepted. Sexual harassment? To be honest I probably wouldn't have done that if it had been a male. And yes, I'm aware that even referring to a member of staff as a "girl" would be considered sexist by some, but when this is a 19 year old and young enough to be my own daughter I don't want to sexualise her (in my own head at least) by calling her a "woman". Am I being sexist there?
We have two simple everyday cases here that could be taken as harassment depending on the attitude of the possible victim. Self-reported studies like this certainly help to show at least a perception of a problem, which may be a problem in and of itself. However this is an area where context is key and a simple "Have you ever been harassed?" conceals at least as much information as it conveys.
I still think there's a place for something between feature and smartphone for business use - secure messaging and email, calendar sync, phone calls and long battery life are all I need
The problems there are twofold: firstly everyone's list is slightly different and secondly the wishlist is often created without any regard for the costs of what is being requested. Oh, and if at this point you start whining about "inefficient software" or "incompetent developers" you are part of the problem: slimming things down dramatically means making hard choices rather than shifting the blame.
Firstly the kitchen sink attitude - "This is an app I need, therefore everyone else needs it too or it is useless for everyone." Like some of the comments here on the Punkt phone the other week - Oh, it's useless for business without WhatsApp... seriously? For business? I can sympathise slightly more with another user's demand for Slack even if I don't use it myself, but adding that it starts to look more like a conventional smartphone. Add WhatsApp as well and we may as well call it a smartphone.
The second is user expectations: basic plain text email is easy so we'll have that. You want HTML formatting? That adds bulk. Want to be able to read that attached PDF or Word document? Bang, you're back at a smartphone.
I think the only way you could do this is to ask not what you can remove but start from scratch and ask what is an absolute must-have. Calls and SMS - well if it doesn't have those it wouldn't be a phone. Something email-like, even if it has to be curated via some proprietary server to simplify the client, as e.g. Blackberry was. A web browser is itself a major chunk of software these days thanks to the layer upon layer of technology on the modern web, but a cut down version could be provided serving mobile-targeted content, a modern WAP if you will. For bespoke business applications some database front end targeting a remote server so people can submit timesheets, delivery notes or whatever it might be for that particular organisation.
Start looking at it that way and you have reduced the requirements enough to make a noticeable difference to the demands on the handset rather than fiddling at the margins. However it's also looking like a WAP-enabled feature phone from 2005. People have moved on since then. I really do hate to say it but it appears people are in general willing to put up with the limitations of modern smartphones if they can be promised all the services they have on that modern smartphone.
I remember reading about car manufacturers experimenting with plastics that deformed when heated so that it was easy to separate them from metal components for recycling.
ISTR hearing another approach for cars where they would point an ultrasonic "gun" at the car which would cause specially designed plastic bolts to fracture so all the plastic trim simply falls off. The idea failed when it was observed this could be done by anyone with the gun whether they owned the vehicle and were in a scrapyard or not.
Because they don't know. The instrument taking these data is basically just a mass spectrometer - it can measure how heavy a molecule that hits it is, and that's it. They can see that there's a bunch of stuff with atomic mass 28u, which can mean N2, CO or C2H4. And from other data they can infer that at least some of that is C2H4 released from the breakup of bits of the dust and other crap generally floating around the place. But the data here can't say anything about what it was all actually made of before that point.
It's considerably more refined than simply looking at the total mass of the molecule. The ionisation process (needed so the compound can be accelerated) naturally breaks up some proportion of the input. The fragments get detected in addition to the complete molecule, so you have to consider the potential fragmentation points of a candidate molecule and how those correlate to the data you see.
For example ethane, relative molecular mass 30, which is what the bulk of the signal from a pure sample would return. However you could shave a hydrogen atom off, giving results at 1 and 29 as well. Or it could be broken between the carbons, giving another report at 15. You have a lot more data for identification than a simple report of the total mass.
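Those fragment masses can be tallied with a trivial sketch (integer atomic masses, ethane only, purely illustrative):

```python
# Integer atomic masses in unified atomic mass units (u).
H, C = 1, 12

def mass(carbons, hydrogens):
    """Mass of a CxHy fragment."""
    return carbons * C + hydrogens * H

parent = mass(2, 6)    # 30: intact ethane, C2H6
minus_h = mass(2, 5)   # 29: one hydrogen shaved off (the H shows up at 1)
methyl = mass(1, 3)    # 15: broken between the carbons, CH3
print(parent, minus_h, methyl)  # prints: 30 29 15
```

A candidate molecule is only plausible if its whole fragmentation pattern, not just the parent peak, lines up with what the spectrometer reports.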
Modern li-ion batteries manage about ten times the energy density (in J/kg) of lead-acid.
Which isn't what he said. In terms of POWER density, i.e. the ability to deliver energy quickly for a given weight, there is still little to touch lead-acid, especially at a similar price point. There is a reason they get used for conventional car batteries, UPSes etc...
Quite possibly the latter. With HT the two logical cores are not equal - you have the lead core and the second essentially picks up the slack when the first is stalled, so runs for only a small minority of the time. If the scheduler is not aware of the distinction a process can get squeezed by consistently being scheduled on the "reserve" core. Paradoxically this is more likely to cause issues on systems that are otherwise underutilised - on a system with a lot of load on it, processes get put on and pulled off cores frequently enough that everything evens itself out over not too long at all.
HT does not add additional cores, it simply enables a core that is stalled on one task to get on with another while the stall clears. In that sense the entire core is contended. It isn't a clear win, but it is to misunderstand the basic principle to complain in essence that a single core doesn't have two FPUs.
And Sun was OK with Google copying the API's. Oracle is hardly in the right for coming along afterward and changing things. This is reminiscent of the company that bought the JPEG patent and tried to bill the world.
I think you mean the GIF patent. JPEG has never been encumbered.
It would be better if they insisted that plugs and sockets were correctly physically fitted and designed to last. Lost 2 laptops due to the pathetic connector falling off the motherboard taking half its tracks with it. My nice shiny new mobile phone's charger connector (USB) already doesn't charge most of the time because the cable is already knackered - after about 10 uses! My old Nokias still work, still charge with a very simple connector.
The Micro USB standard was expressly designed so that the (cheap) cable would fail before the socket on the (expensive) devices does so this is actually by design. On the other hand, how long the lead lasts seems very much in the hands of the user. I'm now on the third charging cable in my desk at work in the last two years - I've used them all at least every other day without issue. The two replacements that were needed were on the odd occasion they were lent to a colleague. And yes, it was the same colleague in both instances...
Crikey! They saw you coming didn't they. 200 squids and they didn't even change an O-ring and pressure test it (that's all that seal is, seriously).
That isn't the issue here; the issue with resealing is that it needs to be done in a vacuum chamber: it is the pressure difference inside and out that restores the waterproofing. Personally I wouldn't trust a "service" from someone not willing to vouch for the seal afterwards - it isn't difficult but needs the equipment, and if they cut corners there, what else have they done?
Not that a watch service should cost anywhere near £200, or need doing anything like annually.
That has *nothing* to do with GPS, a technology that wasn't commercially available until about 2000. I know it's hard to remember a time when we didn't have GPS everywhere (even in our pockets) but this tech was non-existent for commercial purposes until the year this patent was published.
Your dates are well off there. I remember seeing GPS units in Maplin when doing my A levels (95-97), they caught my eye not because they were new at that point but because of how cheap they had become, from memory down to around the £100 mark instead of five or six times that.
Sure they weren't linked to large scale mapping in an integrated unit (at best you'd have a map of motorways and major A roads, enough to navigate to the first half of a post code) and selective availability was still in play so you didn't get the full accuracy, but as a tech it was around at least five years earlier than you think.
the whole idea with IDNs is back-asswards, everybody needs to learn Latin script anyway, for languages that use diacritics, losing them is not a huge problem (and I speak two of them)
I take it when you go abroad you simply talk a little bit slower and a little bit louder? The world does not revolve around you and people don't need to learn a foreign alphabet simply because it fits in with your world view.
A printer was already mentioned, but a card reader with receipt printer would probably sell
In some respects they are already out there, as a separate unit communicating with your phone via Bluetooth or WiFi. I don't really see what you would get by more tightly integrating it into the phone - it would simply add a hell of a lot of bulk for something that would probably be better integrated as a separate unit in any event. You probably could reduce a card reader/printer down to perhaps half an inch depth at the cost of a non-standard, proprietary paper size (something like a Post-It note). Even a small paper roll would at least double that depth, making the combined phone seem very bulky.
In any case, would you really want to hand over your own phone to customers dozens of times a day if there is no good reason to?