Thursday, 17 September 2015

More automation, fewer jobs ahead

Internet of Things in 2025: The good and bad

Within 10 years, the U.S. will see its first robotic pharmacist, driverless cars will make up 10% of all cars on the road, and the first implantable mobile phone will be commercially available.

These predictions, and many others, were included in a World Economic Forum report, released this month. The "Technological Tipping Points Survey" is based on responses from 800 IT executives and other experts.

A tipping point is the moment when specific technological shifts go mainstream. In 10 years, many technologies will be widely used that today are in pilot or are still new to the market.

The Internet of Things will have a major role. Over the next decade there will be one trillion sensors allowing all types of devices to connect to the Internet.

Worldwide, the report estimates, 50 billion devices will be connected to the Internet by 2020. To put that figure in perspective, the report points out, the Milky Way -- the earth's galaxy -- contains about 200 billion suns.

The ubiquitous deployment of sensors, via the Internet of Things, will deliver many benefits, including increases in efficiency and productivity and improved quality of life. But its negative impacts include job losses, particularly for unskilled labor, as well as greater complexity and loss of control.

Robotics, too, will be a mixed bag. It will return some manufacturing to the U.S., as offshore workers are replaced with onshore robots. But robotics -- including the first robotic pharmacist -- will result in job losses as well.

There's concern that "we are facing a permanent reduction in the need for human labor," said the report.

That may still be an outlier view. Efficiency and productivity gains have historically increased employment. But a shift may be underway.

"Science fiction has long imagined the future where people no longer have to work and could spend their time on more noble pursuits," the report said. "Could it be that society is reaching that inflection point in history?"

That question doesn't have a clear answer. The Industrial Revolution destroyed some jobs but created many more, the report points out. "It can be challenging to predict what kinds of jobs will be created, and almost impossible to measure them," the report notes.

Other predictions included:
Driverless cars will make up one in 10 of the vehicles on the road, and this will improve safety, reduce stress, free up time and give older and disabled people more transportation options. But driverless vehicles may also result in job losses, particularly in the taxi and trucking industries.

One in 10 people will be wearing connected clothing in 10 years. Implantable technologies will also be more common, and may be as sophisticated as smartphones. These technologies may help people self-manage healthcare as well as help locate missing children. Potential negatives include loss of privacy and surveillance issues.
The forecasters were bullish on vision technologies over the next decade. This is technology similar to Google Glass that enhances and augments what users see and provides "immersive reality." Eye-tracking technologies will also be used as a means of interaction.

Unlimited free storage that's supported by advertising is expected by 2018.


Saturday, 5 September 2015

Microsoft, U.S. face off again over emails stored in Ireland

The company has refused to turn over to the government the emails stored in Ireland

A dispute between Microsoft and the U.S. government over turning over emails stored in a data center in Ireland comes up for oral arguments in an appeals court in New York on Wednesday.

Microsoft holds that a ruling against it could erode the trust of its cloud customers abroad and strain relationships between the U.S. and other governments that have their own data protection and privacy laws.

Customers outside the U.S. would be concerned about extra-territorial access to their user information, the company has said. A decision against Microsoft could also establish a norm that could allow foreign governments to reach into computers in the U.S. of companies over which they assert jurisdiction, to seize the private correspondence of U.S. citizens.

The U.S. government has a warrant for access to emails held by Microsoft of a person involved in an investigation, but the company holds that nowhere did the U.S. Congress say that the Electronic Communications Privacy Act "should reach private emails stored on providers’ computers in foreign countries."

It prefers that the government use "mutual legal assistance" treaties it has in place with other countries including Ireland. In an amicus curiae (friend of the court) brief filed in December in the U.S. Court of Appeals for the Second Circuit, Ireland said it “would be pleased to consider, as expeditiously as possible, a request under the treaty, should one be made.”

A number of technology companies, civil rights groups and computer scientists have filed briefs supporting Microsoft.

In a recent filing in the Second Circuit court, Microsoft said "Congress can and should grapple with the question whether, and when, law enforcement should be able to compel providers like Microsoft to help it seize customer emails stored in foreign countries."

"We hope the U.S. government will work with Congress and with other governments to reform the laws, rather than simply seek to reinterpret them, which risks happening in this case," Microsoft's general counsel Brad Smith wrote in a post in April.

Lower courts have disagreed with Microsoft's point of view. U.S. Magistrate Judge James C. Francis IV of the U.S. District Court for the Southern District of New York had in April last year refused to quash a warrant that authorized the search and seizure of information linked with a specific Web-based email account stored on Microsoft's premises.

Microsoft complied with the search warrant by providing non-content information held on its U.S. servers but moved to quash the warrant after it concluded that the account was hosted in Dublin and the content was stored there.

If the territorial restrictions on conventional warrants applied to warrants issued under section 2703 (a) of the Stored Communications Act, a part of the ECPA, the burden on the government would be substantial, and law enforcement efforts would be seriously impeded, the magistrate judge wrote in his order. The act covers required disclosure of wire or electronic communications in electronic storage.

While the company held that courts in the U.S. are not authorized to issue warrants for extraterritorial search and seizure, Judge Francis held that a warrant under the Stored Communications Act was "a hybrid: part search warrant and part subpoena." It is executed like a subpoena in that it is served on the Internet service provider, which is required to provide the information from its servers wherever located, and does not involve government officials entering the premises, he noted.

Judge Loretta Preska of the District Court for the Southern District of New York rejected Microsoft's appeal of the ruling, and the company thereafter appealed to the Second Circuit.



Best Microsoft MCTS Certification, Microsoft MCITP Training at certkingdom.com

Monday, 31 August 2015

10 security technologies destined for the dustbin

Systemic flaws and a rapidly shifting threatscape spell doom for many of today’s trusted security technologies

Perhaps nothing, not even the weather, changes as fast as computer technology. With that brisk pace of progress comes a grave responsibility: securing it.

Every wave of new tech, no matter how small or esoteric, brings with it new threats. The security community slaves to keep up and, all things considered, does a pretty good job against hackers, who shift technologies and methodologies rapidly, leaving last year’s well-recognized attacks to the dustbin.

Have you had to enable the write-protect notch on your floppy disk lately to prevent boot viruses or malicious overwriting? Have you had to turn off your modem to prevent hackers from dialing it at night? Have you had to unload your ansi.sys driver to prevent malicious text files from remapping your keyboard to make your next keystroke reformat your hard drive? Did you review your autoexec.bat and config.sys files to make sure no malicious entries were inserted to autostart malware?

Not so much these days -- hackers have moved on, and the technology made to prevent older hacks like these is no longer top of mind. Sometimes we defenders have done such a good job that the attackers decided to move on to more fruitful options. Sometimes a particular defensive feature gets removed because the good guys determined it didn't protect that well in the first place or had unexpected weaknesses.

If you, like me, have been in the computer security world long enough, you’ve seen a lot of security tech come and go. It’s almost to the point where you can start to predict what will stick and be improved and what will sooner or later become obsolete. The pace of change in attacks and technology alike mean that even so-called cutting-edge defenses, like biometric authentication and advanced firewalls, will eventually fail and go away. Surveying today's defense technologies, here's what I think is destined for the history books.

Doomed security technology No. 1: Biometric authentication

Biometric authentication is a tantalizing cure-all for log-on security. After all, using your face, fingerprint, DNA, or some other biometric marker seems like the perfect log-on credential -- to someone who doesn't specialize in log-on authentication. As far as those experts are concerned, it’s not so much that biometric methods are rarely as accurate as most people think; it's more that, once stolen, your biometric markers can't be changed.

Take your fingerprints. Most people have only 10. Anytime your fingerprints are used as a biometric logon, those fingerprints -- or, more accurately, the digital representations of those fingerprints -- must be stored for future log-on comparison. Unfortunately, log-on credentials are far too often compromised or stolen. If the bad guy steals the digital representation of your fingerprints, how could any system tell the difference between your real fingerprints and their previously accepted digital representations?
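To make the problem concrete, here is a toy Python sketch of template-based verification; the storage scheme, byte strings, and function names are invented for illustration, and real matchers are far more sophisticated:

```python
import hashlib

# Hypothetical enrollment store: the server keeps the digital representation
# of the fingerprint (here, simply a hash of the template bytes).
enrolled = {}

def enroll(user, template_bytes):
    # The stored value IS the credential; there is no secret the user can change.
    enrolled[user] = hashlib.sha256(template_bytes).hexdigest()

def verify(user, template_bytes):
    # The server can only compare representations. It cannot tell a live
    # finger from a replay of the stolen template.
    return enrolled.get(user) == hashlib.sha256(template_bytes).hexdigest()

alice_template = b"ridge-pattern-0xA1B2"   # stand-in for real sensor data
enroll("alice", alice_template)

stolen_copy = bytes(alice_template)        # attacker exfiltrates the template
print(verify("alice", stolen_copy))        # the replay verifies like the real finger
```

Unlike a password database breach, there is no "reset" step that makes the stolen value useless.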

In that case, the only solution might be to tell every system in the world that might rely on your fingerprints to not rely on your fingerprints, if that were even possible. The same is true for any other biometric marker. You'll have a hard time repudiating your real DNA, face, retina scan, and so on if a bad player gets their hands on the digital representation of those biometric markers.

That doesn’t even take into account systems that allow you to log on only with, say, your fingerprint. What happens when you can no longer reliably use that fingerprint?

Biometric markers used in conjunction with a secret only you know (password, PIN, and so on) are one way to defeat hackers who have your biometric logon marker. Of course, mental secrets can be captured as well, as happens often with nonbiometric two-factor log-on credentials like smartcards and USB key fobs. In those instances, admins can easily issue you a new physical factor, and you can pick a new PIN or password. That isn't the case when one of the factors is your body.

While biometric logons are fast becoming a trendy security feature, there's a reason they aren’t -- and won't ever be -- ubiquitous. Once people realize that biometric logons aren't what they pretend to be, they will lose popularity and either disappear, always require a second form of authentication, or only be used when high-assurance identification is not needed.

Doomed security technology No. 2: SSL

Secure Sockets Layer was invented by the long-gone Netscape in 1995. For two decades it served us adequately. But if you haven't heard, it is irrevocably broken and can't be repaired, thanks to the Poodle attack. SSL’s replacement, TLS (Transport Layer Security), is slightly better. Of all the doomed security tech discussed in this article, SSL is the closest to being replaced, as it should no longer be used.

The problem? Hundreds of thousands of websites rely on or allow SSL. If you disable all SSL -- a common default in the latest versions of popular browsers -- all sorts of websites don't work. Or they work, but only because the browser or application accepts "downleveling" to SSL. And if it's not websites and browsers, it's the millions of old servers running aging SSL/TLS libraries.

OpenSSL, the library behind much of the Web's encrypted traffic, seems to be constantly under attack these days. While about half of those attacks have nothing to do with SSL itself, legacy SSL support accounts for the other half. Millions of servers still accept SSL connections even though they shouldn't.

Worse, terminology among tech pros is contributing to the problem, as nearly everyone in the computer security industry calls TLS digital certificates "SSL certs" even though they don't use SSL. It's like calling a copy machine a Xerox when it's not that brand. If we’re going to hasten the world off SSL, we need to start calling TLS certs "TLS certs."

Make a vow today: Don't use SSL ever, and call Web server certs TLS certs. That's what they are or should be. The sooner we get rid of the word "SSL," the sooner it will be relegated to history's dustbin.
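Modern TLS libraries make that vow easy to keep. A minimal sketch using Python's standard ssl module, configuring a client context that refuses SSL outright and insists on modern TLS:

```python
import ssl

# PROTOCOL_TLS_CLIENT negotiates TLS only; SSLv2/SSLv3 are off the table.
ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)

# Refuse "downleveling" to anything older than TLS 1.2.
ctx.minimum_version = ssl.TLSVersion.TLSv1_2

# In this mode, certificate and hostname verification are on by default.
print(ctx.check_hostname)
```

Requires Python 3.7 or later; the point is simply that rejecting SSL is now a two-line default, not an exotic configuration.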

Doomed security technology No. 3: Public key encryption

This may surprise some people, but most of the public key encryption we use today -- RSA, Diffie-Hellman, and so on -- is predicted to be readable as soon as quantum computing and cryptography are figured out. Many, including this author, have long (and incorrectly) predicted that usable quantum computing was mere years away. But when researchers finally get it working, most known public encryption ciphers, including the popular ones, will be readily broken. Spy agencies around the world have been saving encrypted secrets for years waiting for the big breakthrough -- or, if you believe some rumors, they already have solved the problem and are reading all our secrets.

Some crypto experts, like Bruce Schneier, have long been dubious about the promise of quantum cryptography. But even the critics can't dismiss the likelihood that, once it's figured out, any secret encrypted by RSA, Diffie-Hellman, or even ECC will be immediately readable.

That's not to say there aren't quantum-resistant cipher algorithms. There are a few, including lattice-based cryptography and Supersingular Isogeny Key Exchange. But if your public cipher isn't one of those, you're out of luck if and when quantum computing becomes widespread.

Doomed security technology No. 4: IPsec
When enabled, IPsec allows all network traffic between two or more points to be cryptographically protected for packet integrity and privacy, aka encrypted. Invented in 1993 and made an open standard in 1995, IPsec is widely supported by hundreds of vendors and used on millions of enterprise computers.

Unlike most of the doomed security defenses discussed in this article, IPsec works and works great. But its problems are twofold.

First, although widely used and deployed, it has never reached the critical mass necessary to keep it in use for much longer. Plus, IPsec is complex and isn't supported by all vendors. Worse, it can often be defeated by only one device in between the source and destination that does not support it -- such as a gateway or load balancer. At many companies, the number of computers that get IPsec exceptions is greater than the number of computers forced to use it.

IPsec's complexity also creates performance issues. When enabled, it can significantly slow down every connection using it, unless you deploy specialized IPsec-enabled hardware on both sides of the tunnel. Thus, high-volume transaction servers such as databases and most Web servers simply can’t afford to employ it. And those two types of servers are precisely where most important data resides. If you can't protect most data, what good is it?

Plus, despite being a "common" open standard, IPsec implementations from different vendors often don't interoperate, another factor that has slowed or prevented widespread adoption.

But the death knell for IPsec is the ubiquity of HTTPS. When you have HTTPS enabled, you don't need IPsec. It's an either/or decision, and the world has spoken. HTTPS has won. As long as you have a valid TLS digital certificate and a compatible client, it works: no interoperability problems, low complexity. There is some performance impact, but it’s not noticeable to most users. The world is quickly becoming a default world of HTTPS. As that progresses, IPsec dies.

Doomed security technology No. 5: Firewalls

The ubiquity of HTTPS essentially spells the doom of the traditional firewall. I wrote about this in 2012, creating a mini-firestorm that won me invites to speak at conferences all over the world.

Some people would say I was wrong. Three years later, firewalls are still everywhere. True, but most aren't configured properly, and almost none have the "least permissive, block-by-default" rules that make a firewall valuable in the first place. Most firewalls I come across have overly permissive rules. I often see "Allow All ANY ANY" rules, which essentially means the firewall is worse than useless. It's doing nothing but slowing down network connections.

However you define a firewall, it must include some portion that allows only specific, predefined ports in order to be useful. As the world moves to HTTPS-only network connections, all firewalls will eventually have only a few rules: HTTP/HTTPS and maybe DNS. Other protocols, such as DNS, DHCP, and so on, will likely start using HTTPS-only too. In fact, I can't imagine a future that doesn't end up HTTPS-only. When that happens, what of the firewall?
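Concretely, the "few rules" firewall of an HTTPS-only world could look something like this iptables sketch -- a hypothetical configuration for illustration, not a recommendation; the port choices are assumptions and the commands require root:

```shell
# Least permissive, block by default: drop anything not explicitly allowed.
iptables -P INPUT DROP
iptables -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT

# The handful of rules left once everything rides HTTPS:
iptables -A INPUT -p tcp --dport 443 -j ACCEPT   # HTTPS
iptables -A INPUT -p tcp --dport 80  -j ACCEPT   # HTTP (typically just redirects)
iptables -A INPUT -p udp --dport 53  -j ACCEPT   # DNS, until it too moves to HTTPS
```

When a ruleset shrinks to this, the firewall is no longer doing much that the endpoint's own TLS stack isn't already doing.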

The main protection firewalls offer is to secure against a remote attack on a vulnerable service. Remotely vulnerable services, usually exploited by one-touch, remotely exploitable buffer overflows, used to be among the most common attacks. Look at the Robert Morris Internet worm, Code Red, Blaster, and SQL Slammer. But when's the last time you heard of a global, fast-acting buffer overflow worm? Probably not since the early 2000s, and none of those were as bad as the worms from the 1980s and 1990s. Essentially, if you don't have an unpatched, vulnerable listening service, then you don't need a traditional firewall -- and right now you don't. Yep, you heard me right. You don't need a firewall.

Firewall vendors often write to tell me that their "advanced" firewall has features beyond the traditional firewall that make theirs worth buying. Well, I've been waiting for more than two decades for "advanced firewalls" to save the day. It turns out they don't. If they perform "deep packet inspection" or signature scanning, it either slows down network traffic too much, is rife with false positives, or scans for only a small subset of attacks. Most "advanced" firewalls scan for a few dozen to a few hundred attacks. These days, more than 390,000 new malware programs are registered every day, not including all the hacker attacks that are indistinguishable from legitimate activity.

Even when firewalls do a perfect job at preventing what they say they prevent, they don't really work, given that they don't stop the two biggest malicious attacks most organizations face on a daily basis: unpatched software and social engineering.

Put it this way: Every customer and person I know currently running a firewall is as hacked as someone who doesn't. I don't fault firewalls. Perhaps they worked so well back in the day that hackers moved on to other sorts of attacks. For whatever reason, firewalls are nearly useless today and have been trending in that direction for more than a decade.

Doomed security technology No. 6: Antivirus scanners

Depending on whose statistics you believe, malware programs currently number in the tens to hundreds of millions -- an overwhelming fact that has rendered antivirus scanners nearly useless.

Not entirely useless, because they stop 80 to 99.9 percent of attacks against the average user. But the average user is exposed to hundreds of malicious programs every year; even with the best odds, the bad guy wins every once in a while. If you keep your PC free from malware for more than a year, you've done something special.

That isn’t to say we shouldn’t applaud antivirus vendors. They've done a tremendous job against astronomical odds. I can't think of another sector that has had to cope with such overwhelming growth in threats and technology since the late 1980s, when there were only a few dozen viruses to detect.

But what will really kill antivirus scanners isn't this glut of malware. It's whitelisting. Right now the average computer will run any program you install. That's why malware is everywhere. But computer and operating system manufacturers are beginning to reset the "run anything" paradigm for the safety of their customers -- a movement that is antithetical to antivirus programs, which allow everything to run unimpeded except for programs that contain one of the more than 500 million known antivirus signatures. “Run by default, block by exception” is giving way to “block by default, allow by exception.”
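The paradigm shift is easy to see in a toy Python sketch (the program names are made up):

```python
# "Run by default, block by exception": today's antivirus model.
# Everything unknown is allowed to run.
blacklist = {"known-evil.exe"}

def av_allows(program):
    return program not in blacklist

# "Block by default, allow by exception": whitelisting.
# Everything unknown is stopped.
whitelist = {"winword.exe", "excel.exe"}

def whitelist_allows(program):
    return program in whitelist

# A brand-new malware sample has no signature yet...
print(av_allows("fresh-malware.exe"))         # the scanner waves it through
print(whitelist_allows("fresh-malware.exe"))  # the whitelist stops it cold
```

The whitelist never needs to learn about 500 million signatures; it only needs to know what is approved.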

Of course, computers have long had whitelisting programs, aka application control programs. I reviewed some of the more popular products back in 2009. The problem: Most people don't use whitelisting, even when it’s built in. The biggest roadblock? The fear of what users will do if they can't install everything they want willy-nilly or the big management headache of having to approve every program that can be run on a user’s system.

But malware and hackers are getting more pervasive and worse, and vendors are responding by enabling whitelisting by default. Apple's OS X introduced a near version of default whitelisting three years ago with Gatekeeper. iOS devices have had near-whitelisting for much longer in that they can run only approved applications from the App Store (unless the device is jailbroken). Some malicious programs have slipped by Apple, but the process has been incredibly successful at stopping the huge influx that normally follows popular OSes and programs.

Microsoft has long had a similar mechanism, through Software Restriction Policies and AppLocker, but an even stronger push is coming in Windows 10 with DeviceGuard. Microsoft’s Windows Store also offers the same protections as Apple's App Store. While Microsoft won't be enabling DeviceGuard or Windows Store-only applications by default, the features are there and are easier to use than before.

Once whitelisting becomes the default on most popular operating systems, it's game over for malware and, subsequently, for antivirus scanners. I can't say I'll miss either.

Doomed security technology No. 7: Antispam filters

Spam still makes up more than half of the Internet's email. You might not notice this anymore, thanks to antispam filters, which have reached levels of accuracy that antivirus vendors can only claim to deliver. Yet spammers keep spitting out billions of unwanted messages each day. In the end, only two things will ever stop them: universal, pervasive, high-assurance authentication and more cohesive international laws.
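As a rough illustration of what those filters do, here is a minimal Python sketch of content-based spam scoring, loosely in the spirit of the Bayesian filtering most antispam products use; the word weights are invented for the example:

```python
# Per-word "spamminess" weights a real filter would learn from training data.
SPAM_WEIGHTS = {"free": 0.8, "winner": 0.9, "viagra": 0.99, "meeting": 0.1}

def spam_score(message):
    # Average the weights of the known words in the message.
    words = [w.strip(".,!?").lower() for w in message.split()]
    scores = [SPAM_WEIGHTS[w] for w in words if w in SPAM_WEIGHTS]
    return sum(scores) / len(scores) if scores else 0.0

print(spam_score("You are a WINNER of free money!"))  # scores high
print(spam_score("Agenda for tomorrow's meeting"))    # scores low
```

Scoring content works remarkably well, but it can never prove who sent the message, which is why the text argues that only pervasive identity will finish the job.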

Spammers still exist mainly because we can't easily catch them. But as the Internet matures, pervasive anonymity will be replaced by pervasive high-assurance identities. At that point, when someone sends you a message claiming to have a bag of money to mail you, you will be assured they are who they say they are.

High-assurance identities can only be established when all users are required to adopt two-factor (or higher) authentication to verify their identity, followed by identity-assured computers and networks. Every cog in between the sender and the receiver will have a higher level of reliability. Part of that reliability will be provided by pervasive HTTPS (discussed above), but it will ultimately require additional mechanisms at every stage of authentication to assure that when I say I'm someone, I really am that someone.

Today, almost anyone can claim to be anyone else, and there's no universal way to verify that person's claim. This will change. Almost every other critical infrastructure we rely on -- transportation, power, and so on -- requires this assurance. The Internet may be the Wild West right now, but the increasingly essential nature of the Internet as infrastructure virtually ensures that it will eventually move in the direction of identity assurance.

Meanwhile, the international border problem that permeates nearly every online-criminal prosecution is likely to be resolved in the near future. Right now, many major countries do not accept evidence or warrants issued by other countries, which makes arresting spammers (and other malicious actors) nearly impossible. You can collect all the evidence you like, but if the attacker’s home country won't enforce the warrant, your case is toast.

As the Internet matures, however, countries that don't help ferret out the Internet's biggest criminals will be penalized. They may be placed on a blacklist. In fact, some already are. For example, many companies and websites reject all traffic originating from China, whether it's legitimate or not. Once we can identify criminals and their home countries beyond repudiation, as outlined above, those home countries will be forced to respond or suffer penalties.

The heyday of spammers, when most of their crap reached your inbox, is already over. Pervasive identities and international law changes will close the coffin lid on spam -- and the security tech necessary to combat it.

Doomed security technology No. 8: Anti-DoS protections

Thankfully, the same pervasive identity protections mentioned above will be the death knell for denial-of-service (DoS) attacks and the technologies that have arisen to quell them.

These days, anyone can use free Internet tools to overwhelm websites with billions of packets. Most operating systems have built-in anti-DoS attack protections, and more than a dozen vendors can protect your websites even when they're being hit by extraordinary amounts of bogus traffic. But the loss of pervasive anonymity will stop all malicious senders of DoS traffic. Once we can identify them, we can arrest them.

Think of it this way: Back in the 1920s there were a lot of rich and famous bank robbers. Banks finally beefed up their protection, and cops got better at identifying and arresting them. Robbers still hit banks, but they rarely get rich, and they almost always get caught, especially when they persist in robbing more banks. The same will happen to DoS senders. The sooner we can identify them, the sooner these bothersome elements of society will disappear.

Doomed security technology No. 9: Huge event logs

Computer security event monitoring and alerting is difficult. Every computer is easily capable of generating tens of thousands of events on its own each day. Collect them to a centralized logging database and pretty soon you're talking petabytes of needed storage. Today's event log management systems are often lauded for the vast size of their disk storage arrays.

The only problem: This sort of event logging doesn't work. When nearly every collected event packet is worthless and goes unread, and the cumulative effect of all the worthless unread events is a huge storage cost, something has to give. Soon enough admins will require application and operating system vendors to give them more signal and less noise, by passing along useful events without the mundane log clutter. In other words, event log vendors will soon be bragging about how little space they take rather than how much.
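"More signal and less noise" can be as simple as filtering at the source instead of shipping everything to central storage; a Python sketch, with an invented event format:

```python
# Severities an analyst will never read; drop them before they hit the wire.
NOISE = {"heartbeat", "debug", "info"}

def worth_forwarding(event):
    return event["severity"] not in NOISE

events = [
    {"severity": "heartbeat", "msg": "agent alive"},
    {"severity": "info",      "msg": "service polled"},
    {"severity": "critical",  "msg": "admin group membership changed"},
]

# Only the event an admin would actually act on survives the filter.
forwarded = [e for e in events if worth_forwarding(e)]
print(len(forwarded))
```

Pushing that filter down into the application and OS vendors' own logging, rather than bolting it on afterward, is exactly the shift the text predicts.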

Doomed security technology No. 10: Anonymity tools (not to mention anonymity and privacy)

Lastly, any mistaken vestige of anonymity and privacy will be completely wiped away. We already really don't have it. The best book I can recommend on the subject is Bruce Schneier's "Data and Goliath." A quick read will scare you to death if you didn't already realize how little privacy and anonymity you truly have.

Even hackers who think that hiding on Tor and other "darknets" give them some semblance of anonymity must understand how quickly the cops are arresting people doing bad things on those networks. Anonymous kingpin after anonymous kingpin ends up being arrested, identified in court, and serving real jail sentences with real jail numbers attached to their real identity.

The truth is, anonymity tools don't work. Many companies, and certainly law enforcement, already know who you are. The only difference is that, in the future, everyone will know the score and stop pretending they are staying hidden and anonymous online.

I would love for a consumer's bill of rights guaranteeing privacy to be created and passed, but past experience teaches me that too many citizens are more than willing to give up their right to privacy in return for supposed protection. How do I know? Because it's already the standard everywhere but the Internet. You can bet the Internet is next.


Tuesday, 18 August 2015

Windows 10's usage share ascent routs Windows 7's

Early data shows Microsoft's new OS has come out of the gate faster than the venerable Windows 7 did six years ago

Microsoft's Windows 10 has easily surpassed the global usage share ascension of Windows 7 six years ago, putting it on track to become the company's most successful OS introduction ever.

Windows 10's usage share neared 6.6% on Sunday, data from analytics vendor StatCounter showed. That was a 23% increase over what the OS logged the Sunday prior, continuing the unbroken stretch of double-digit-or-more, week-over-week increases since Windows 10's July 29 debut.
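Working backward from those figures (a rough check, since StatCounter's exact numbers vary day to day): a 23% week-over-week increase ending near 6.6% implies the prior Sunday's share was about 5.4%.

```python
# If share grew 23% week over week to reach 6.6%,
# then prior_share * 1.23 == 6.6, so:
current_share = 6.6
growth = 0.23
prior_share = current_share / (1 + growth)
print(round(prior_share, 1))
```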

The operating system's usage share increase has comfortably exceeded that of its closest rival, Windows 7, during the latter's first 18 days of availability in the fall of 2009. Windows 7's high-water mark during that stretch was 3.8% by StatCounter's count.

StatCounter estimates what Computerworld has dubbed "usage share" by tallying page views, making the Irish metric firm's numbers a signal of activity on the Internet rather than of users.

Windows 7's uptake as measured by StatCounter was quicker off the mark than was Windows 10's, a fact that should not come as a surprise, since the former was released as a paid upgrade that users had been eagerly buying in the run-up to its Oct. 22, 2009, official launch.

Microsoft treated Windows 10 differently, choosing not only to give away the upgrade to hundreds of millions of Windows 7 and Windows 8.1 users, but to deliver that upgrade to customer subsets over time. The staggered rollout was part of an effort to minimize stress on the Redmond, Wash., company's content distribution network and to let it track problems, then presumably fix them, before triggering upgrade notifications for the next "wave."

By the fourth day after its introduction, Windows 10's usage share had topped that of Windows 7 at the same point in its post-debut schedule. Windows 10 has not relinquished the lead since then.




Wednesday, 12 August 2015

Sorriest technology companies of 2015

A rundown of the year in apologies from tech vendors and those whose businesses rely heavily on tech.

Sorry situation
Despite all the technology advances that have rolled out this year, it’s also been a sorry state of affairs among leading network and computing vendors, along with businesses that rely heavily on technology. Apple, Google, airlines and more have issued tech-related mea culpas in 2015…

Sony says Sorry by saying Thanks
Network outages caused by DDoS attacks spoiled holiday fun for those who got new PlayStation 4 games and consoles, so Sony kicked off 2015 with an offer of 10% off new purchases, plus an extended free trial for some.

NSA’s backdoor apology
After being outed by Microsoft and later Edward Snowden for allowing backdoors to be inserted into devices via a key security standard, the NSA sort of apologized. NSA Director of Research Michael Wertheimer, writing in the Notices of the American Mathematical Society, acknowledged in "The Mathematics Community and the NSA" that mistakes were made. He wrote in part: "With hindsight, NSA should have ceased supporting the Dual_EC_DRBG algorithm immediately after security researchers discovered the potential for a trapdoor."

You probably forgot about this flag controversy
China’s big WeChat messaging service apologized in January for bombarding many of its hundreds of millions of users – and not just those in the United States -- with Stars and Stripes icons whenever they typed in the words “civil rights” on Martin Luther King, Jr. Day. WeChat also took heat for not offering any sort of special icons when users typed in patriotic Chinese terms. The special flag icons were only supposed to have been seen by US users of the service.

Go Daddy crosses the line
Website domain provider Go Daddy as usual relied on scantily clad women as well as animals to spread its message during this past winter's Super Bowl. The surprising thing is that the animals are what got the company into hot water this time. The company previewed an ad that was supposed to parody Budweiser commercials, but its puppy-mill punch line didn't have many people laughing, so the CEO wound up apologizing and pulling the ad.

Name calling at Comcast
Comcast scrambled to make things right after somehow changing the name of a customer on his bill from his actual name, Ricardo Brown, to "(expletive… rhymes with North Pole) Brown." The change took place after Brown's wife called Comcast to discontinue cable service. The service provider told a USA Today columnist that it was investigating the matter, but in the meantime was refunding the Browns for two years of previous service.

Where to start with Google?
Google’s Department of Apologies has been busy this year: In January the company apologized when its translation services spit out anti-gay slurs in response to searches on the terms “gay” and “homosexual.” In May, Google apologized after a Maps user embedded an image of the Android mascot urinating on Apple’s logo. This summer, Google has apologized for its new Photos app mislabeling African Americans as “gorillas” and for Google Niantic Labs’ Ingress augmented reality game including the sites of former Nazi concentration camps as points of interest.

Carnegie Mellon admissions SNAFU
Carnegie Mellon University's Computer Science School in February apologized after it mistakenly accepted 800 applicants to its grad program, only to send out rejection notices hours later. The irony of a computer glitch leading to this problem at such a renowned computer science school was lost on no one…

Lenovo Superfish debacle
Lenovo officials apologized in February after it was discovered that Superfish adware packaged with some of its consumer notebooks was not only a pain for users but also included a serious security flaw resulting from interception of encrypted traffic. “I have a bunch of very embarrassed engineers on my staff right now,” said Lenovo CTO Peter Hortensius. “They missed this.” Lenovo worked with Microsoft and others to give users tools to rid themselves of Superfish.

Apple apologizes for tuning out customers
Apple apologized in March for an 11-hour iTunes service and App Store outage that it blamed on “an internal DNS error at Apple,” in a statement to CNBC.

Blame the iPads
American Airlines in April apologized after digital map application problems on pilot iPads delayed dozens of flights over a two-day period. The airline did stress that the problem was a third-party app, not the Apple products themselves.

Locker awakened
The creator of a strain of ransomware called Locker apologized after he "woke up" the malware, which encrypted files on infected devices and demanded money to release them. A week after the ransomware was activated, the creator apparently had a change of heart and released the decryption keys victims needed to unlock their systems.

HTC wants to be Hero
Phonemaker HTC's CEO Cher Wang apologized to investors in June, according to the Taipei Times, after the company's new One M9 flagship phone failed to boost sales. "HTC's recent performance has let people down," said Wang, pointing to better times ahead with the planned fall release of a new phone dubbed Hero.

Ketchup for adults only
Ketchup maker Heinz apologized in June after an outdated contest-related QR code on its bottles sent a German man to an X-rated website. Meanwhile, the website operator offered the man who complained a free year’s worth of access, which he declined.

Livid Reddit users push out interim CEO
Interim Reddit CEO Ellen Pao apologized in July (“we screwed up”) after the online news aggregation site went nuts over the sudden dismissal of an influential employee known for her work on the site’s popular Ask Me Anything section. Pao shortly afterwards resigned from her post following continued demands for her ouster by site users.

Blame the router
United Airlines apologized (“we experienced a network connectivity issue. We are working to resolve and apologize for any inconvenience.”) in July after being forced to ground its flights for two hours one morning due to a technology issue that turned out to be router-related. United has suffered a string of tech glitches since adopting Continental’s passenger management system a few years back following its acquisition of the airline.

Billion dollar apology
Top Toshiba executives resigned in July following revelations that the company had systematically padded its profits by more than $1 billion over a six-year period. “I recognize there has been the most serious damage to our brand image in our 140-year history,” said outgoing President Hisao Tanaka, who is to be succeeded by Chairman Masashi Muromachi. “We take what the committee has pointed out very seriously, and it is I and others in management who bear responsibility.”



Saturday, 1 August 2015

Top 5 factors driving domestic IT outsourcing growth

Despite insourcing efforts, the expansion of nearshore centers is not necessarily taking work away from offshore locations. Eric Simonson of the Everest Group discusses the five main drivers responsible for the rise in domestic outsourcing, why Indian providers dominate the domestic landscape and more.

IT service providers placed significant focus on staffing up their offshore delivery centers during the previous decade. However, over the past five years, outsourcing providers have revved up their U.S. domestic delivery center activity, according to recent research by outsourcing consultancy and research firm Everest Group.

The American outsourcing market currently employs around 350,000 full-time professionals and is growing between 3 and 20 percent a year, depending on function, according to Everest Group's research.

Yet the expansion of nearshore centers is not necessarily taking work away from offshore locations in India and elsewhere. Big insourcing efforts, like the one announced by GM, remain the exception. Companies are largely sticking with their offshore locations for existing non-voice work and considering domestic options for new tasks, according to Eric Simonson, Everest Group’s managing partner for research.

We spoke to Simonson about the five main drivers of domestic outsourcing growth, the types of IT services growing stateside, why Indian providers dominate the domestic landscape, and how providers plan to meet the growing demand for U.S. IT services skills.

Interest in domestic IT outsourcing is on the rise, but you say that that does not indicate any dissatisfaction with the offshore outsourcing model.

Simonson: This isn’t about offshore not working and companies deciding to bring the work back. That’s happening a bit with some call center and help desk functions. But, by and large, these delivery center setups are more about bringing the wisdom of global delivery into the domestic market. The fundamental goal is industrializing the onshore model vs. fixing what’s broken offshore.

Can you talk about the five main drivers behind their increased interest in locating stateside?
Simonson: The first is diversification of buyer needs. As buyers have to support new types of services, certain types of tasks may be better delivered nearshore rather than offshore.

Secondly, there may be a desire to leverage the soft skills of onshore talent. This occurs when you need someone with a certain type of domestic business knowledge or dialect or cultural affinity.

Thirdly, domestic sourcing can be a way to overcome the structural challenges associated with offshore delivery, such as high attrition and burnout on graveyard shifts.

Fourth, companies may be seeking to manage certain externalities, like regulatory requirements or fears about visa availability. To some extent, these reasons are often not based on true requirements, but are a convenient justification for outsourcing domestically rather than accepting the potential risks of offshore delivery.

Finally, there may be client-specific needs that demand domestic solutions—a local bank that wants to keep jobs in the community or a company with no experience offshore looking to start the learning curve.

Within IT services, what types of work currently dominate the domestic landscape?
Simonson: Application development is most prominent, with 123 domestic delivery centers in tier-one and -two cities serving financial services, public sector, manufacturing, retail and consumer packaged goods clients. Just behind that is IT infrastructure in similar geographies focused on those verticals as well. There are 80 consulting and systems integration centers and 68 testing centers as well.

It’s interesting to note that while U.S.-based providers tend to operate larger IT service centers domestically, it’s actually the Indian providers that dominate the landscape.

Simonson: Traditional U.S.-based multinationals have captured more scale in individual centers and have been able to grow them, in some ways, more strategically. They’ve been able to set up shop in smaller tier-4 cities like Ann Arbor or Des Moines and have more proven local talent models.

But the majority of domestic centers are operated by India-centric providers. Part of that is driven by their desire to get closer to their customers. With application and systems integration work, the ability to work more closely with the client is increasingly valuable. And with infrastructure work, concerns about data and systems access have encouraged Indian companies to offer more onshore options.

In addition, some of the bad press they’ve received related to visa issues is encouraging them to balance out their delivery center portfolios.

But Indian providers are not necessarily staffing up their centers with American workers.
Simonson: Indian providers are more likely to use visas to bring citizens of other countries (predominantly India) into the country to work on a temporary or permanent basis in a delivery center. About 32 percent of their domestic delivery center workforce consists of these 'landed resources.' Across all providers, landed resources account for 6 percent of domestic service delivery employees. However, tightening visa norms and higher visa rejection rates are making it more difficult for providers to rely on foreign workers.

You found that approximately 43 percent of the delivery centers are located in the South, with almost half of those concentrated in the South Atlantic. And Texas has more than fifty. Is that simply due to the fact that it's cheaper to operate there?

Simonson: Cheap helps. But equally important are overall population trends. The South is growing, while regions like the Northeast and Midwest are either stable or in decline. If you look at where people are going to school or moving, and where corporations are relocating their headquarters, it's happening from the Carolinas down through Florida and over through Arkansas, Oklahoma and Texas. Those states are also more progressive about attracting services businesses (although there are some exceptions outside of the South, like North Dakota and Missouri).

Do you expect the domestic IT outsourcing market to continue to grow?
Simonson: Yes, service providers expect an increase in demand for domestic outsourcing services from new and existing customers, and plan to increase their domestic delivery capabilities by adding full-time employees to their existing centers and establishing new delivery centers. In fact, 60 percent of delivery centers plan to add headcount over the next three years, with India-centric service providers expected to lead the expansion.

Tier-2 and tier-3 cities, like Orlando, Atlanta and Rochester, are poised for the greatest growth, with tier-1 and rural centers expecting the least amount of growth.

Will the supply of domestic IT talent keep up with this increased demand?
Simonson: The pressure to find IT talent has led service providers to adopt a range of approaches to extend their reach and develop ecosystems of talent. Many have developed educational partnerships, creating formal and informal relationships with colleges and technical institutes. They’re also basing themselves in cities known for their quality of life and recruiting entry-level and experienced talent from elsewhere. It all impacts what communities they decide to work in.

All service providers will have to expand their talent pools, particularly in IT. Automation of some tasks could increase capacity, but doesn’t provide the higher-complexity skills that are most valued onshore.


Monday, 13 July 2015

The Apple Watch disrupts, but is that enough?

For some it's a must-have, but others may want to wait before committing

Disruptive technology doesn't come along often, and when it does it's frequently dismissed at first, because it's easy to ignore something you've lived your entire life without. But every once in a while a bit of tech comes along that makes it easier to do what you're already doing.

This is the Apple Watch.

I wasn't always sold on the concept. Aside from issues related to appearance and style, functionality, personalization, fitness tracking, and useful interaction methods, my big concern was this: What real-world problem would an Apple Watch solve? Knowing the obstacles was one thing; solving those problems was something else entirely. I was skeptical.

The engineers at Apple not only understood those issues but figured out solutions. By the time Apple execs finished unveiling their vision for the modern watch last September, I was ready to give the technology a shot. As someone who's built a career around tech, I couldn't remember the last time a watch of any type inspired an emotional reaction.

Much of my excitement stemmed from the new technologies, especially the Digital Crown and Force Touch, both of which work wonderfully in the real world.

Crowning achievement

With the Digital Crown, Apple engineers turned a feature already present in watches into a scroll wheel for selecting options and quickly sliding through list views. Pressed once, it works much like an iPhone's Home button for accessing apps. Double-pressing it switches between the last-used app and the Clock app; holding the Crown down activates Siri; and when you use it to scroll to the end of a list, it even becomes harder to turn. (That last feature shows the obsessive level of detail that's characteristic of Apple.)
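A toy model of that scrolling behavior, with invented names and a one-item-per-tick granularity (a hypothetical sketch, not Apple's actual implementation):

```python
def scroll(index: int, ticks: int, length: int) -> int:
    """Advance a list selection by crown ticks, clamped to the list's ends."""
    return max(0, min(length - 1, index + ticks))

apps = ["Clock", "Messages", "Workout", "Remote", "Maps"]
i = scroll(0, 3, len(apps))    # three ticks down -> fourth item
i = scroll(i, 10, len(apps))   # overshoot past the end -> clamped to the last item
print(apps[i])                 # Maps
```

The clamping at the ends is where a real device would add the extra physical resistance described above.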

Handing off scrolling and button-like functionality to the crown is so obvious -- in retrospect -- that it's amazing no one came up with the idea beforehand. This is typical Apple.

With Force Touch, the Apple Watch's Retina display not only responds to touch and gestures, but also senses when additional force is applied to the screen. That extra pressure brings up additional options in supported apps: It can call up app settings, dismiss notifications, pause or end workouts, select audio and video sources in Remote, and customize Watch faces. The cleverness of Force Touch is that these actions would otherwise need their own onscreen icons, using up precious space on a device with limited screen real estate.

Force Touch works so well in the real world that the technology has started spreading to other Apple products, like the latest MacBooks and MacBook Pro laptops. It's only a matter of time before iPads and iPhones get this, too.

Uniting and adding to these new technologies is a tried-and-true method that underpins the success of the Watch: Siri. On the Watch, Siri is used for all sorts of voice commands, like setting timers, checking weather, launching apps -- as well as for dictating messages. The Apple Watch relies on Siri for functions that would normally require a keyboard; without Siri, the Watch would fail.

These three technologies allow the Watch to stand above competitors' offerings. Physically, though, the Watch has the distinction of actually looking like a watch -- and a nice one at that. It's not embarrassing to wear, regardless of the occasion. Watch bands can be removed and swapped out easily, and the number of watch/band combos continues to rise.

Apple Watch makes technology as fashionable as possible, more so than any previous attempts in the category from anyone else. But, while it (debatably) looks great -- especially for a wearable computer -- the key to usability (and success) is software: the Watch operating system, apps and ecosystem.

Fitness and notifications

When I got my Apple Watch in April, I was looking for it to do two things: be a fitness accessory/advisor and a notification system for important alerts. However, I underestimated the importance of apps. There are well over 4,000 now available, with more coming. Currently, apps have flaws -- many are still slow to load, and the display will often turn off before they load fully -- but that should improve significantly with native app support, which is coming this fall with the Watch OS 2.0 update. That update promises faster app launches and developer access to features not available to them now, including the accelerometer and the heart rate monitor. There will also be support for non-Apple Complications, and Night Stand mode (which works wonderfully with my favorite stand from Nomad).

In 2007, when the first iPhone was released, I wrote about a digital future where data is at your fingertips. That future is now; we're living the mobile dream, with devices like the iPhone designed for portability and instant access to all sorts of information. That also means a world in which our devices never shut up. In practical use, this is one of the areas where the Watch truly shines: filtering digital noise.

The Watch is clearly the type of product that grows on you. I'm still using my iPhone; the Watch hasn't made it obsolete, especially because it relies on the phone for so much backend work. But when I pull the iPhone out, it's for different reasons now. I can quickly respond to texts, control music, check my calendar for upcoming events, track packages, check on the order status of Apple Store purchases, and get directions via the Watch without getting sucked into other apps -- which happens when I pick up the iPhone.

This is a big deal for me. The iPhone, with all it can do, is a gigantic time-suck, and it's easy to fall into the trap. The Watch is designed for short bursts of interactions, without the distractions inherent to a device that does just about everything.

Fitness tracking is still a huge deal for me, but as someone who uses the Watch to track running, basketball, and especially weight lifting, I'm not very impressed. While the Watch has excellent heart rate monitoring sensors, they only work well for activities in which your arms wave about. In those cases, the Watch is spot on.

Weightlifters need not apply

Tracking activities like lifting weights or pushups is another matter, and here is where the Watch falls on its proverbial face. If you're an active weight lifter in the market for a fitness tracker, this isn't it. When lifting weights, the heart monitoring is the worst feature of the Watch. It's supposed to monitor your heart rate every 10 minutes in normal mode, and every 10 seconds during a workout. But when Apple released the 1.0.1 update, it changed that behavior so that if the Watch senses movement in normal mode, it skips the heart rate reading. This is absurd. The opposite should occur: if the Watch senses sustained, increased movement, the correct response is to instantly check pulse rate to gauge exertion levels. (The inaccurate readings while lifting weights are a known issue and are supposed to be resolved in a future software update, but who knows when.)
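The sampling policy described above (a reading every 10 minutes at rest, every 10 seconds during a workout, and, after 1.0.1, skipped when motion is detected outside a workout) can be sketched as a small policy function. This is a model of the behavior as described, not Apple's actual firmware logic:

```python
from typing import Optional

def next_sample_interval(in_workout: bool, motion_detected: bool) -> Optional[int]:
    """Seconds until the next heart-rate reading, or None to skip it.

    Models the post-1.0.1 behavior described in the article:
    workout mode samples every 10 seconds regardless of motion;
    normal mode samples every 10 minutes, but skips when moving.
    """
    if in_workout:
        return 10
    if motion_detected:
        return None  # the 1.0.1 change the article calls absurd
    return 600

print(next_sample_interval(True, True))    # 10
print(next_sample_interval(False, False))  # 600
print(next_sample_interval(False, True))   # None
```

The complaint in the article amounts to wanting the `motion_detected` branch to shorten the interval rather than skip the reading.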


What isn't disappointing, though, is that the Watch is more water-resistant than I expected. I've used the Watch in showers, hot tubs, and while swimming. I didn't dive below 15 feet, but I wore it while playing basketball in a pool, and I was in the water for hours. Do I recommend getting it wet? Not really, and neither does Apple. But you can. (The Watch is rated to survive 30 minutes at one meter's depth.)

The technology in the Apple Watch will, of course, improve with each successive software update (and each new generation of the Watch itself). Even so, the Watch already marks the first time technology as fashion has sold in large numbers. When I wrote my first iPhone review, I said that breakthrough products like this leave an imprint in time, a pivot point we can clearly see: before and after. Even though I'm disappointed in its tracking of activities like weight lifting, the Watch is that kind of product.

The more people purchase and use the Watch, the more attention the device will get from third-party developers and service providers. There will come a point when the number of wearers will be hard to ignore, forcing businesses and third parties to support the services those wearers expect, especially something like Apple Pay.

But is that today?

So, should you get one?

I'm in an interesting position when it comes to recommending the Watch. At this point, you likely know whether or not you want one. Apple has already sold more of them in a few weeks than all of the competition sold in years, and I'm clearly a fan (as are other Watch owners I know). But it's still too soon to know whether the functions and fashion it offers -- or will offer in future iterations -- will be enough to lure the hordes of new users who follow early adopters.

Two years ago I figured if an Apple Watch were ever released, it would be because Apple leaders were confident of its impact. I said then that I'd have to see it to believe it.

Well, I've seen it, I've used it, and I'm a believer: Despite the first-generation problems, you can have my Watch after you pry it from my cold, dead wrist.
