Monday 31 August 2015

10 security technologies destined for the dustbin

Systemic flaws and a rapidly shifting threatscape spell doom for many of today’s trusted security technologies

Perhaps nothing, not even the weather, changes as fast as computer technology. With that brisk pace of progress comes a grave responsibility: securing it.

Every wave of new tech, no matter how small or esoteric, brings with it new threats. The security community slaves away to keep up and, all things considered, does a pretty good job against hackers, who shift technologies and methodologies rapidly, consigning last year's well-recognized attacks to the dustbin.

Have you had to enable the write-protect notch on your floppy disk lately to prevent boot viruses or malicious overwriting? Have you had to turn off your modem to prevent hackers from dialing it at night? Have you had to unload your ansi.sys driver to prevent malicious text files from remapping your keyboard to make your next keystroke reformat your hard drive? Did you review your autoexec.bat and config.sys files to make sure no malicious entries were inserted to autostart malware?

Not so much these days -- hackers have moved on, and the technology made to prevent older hacks like these is no longer top of mind. Sometimes we defenders have done such a good job that the attackers decided to move on to more fruitful options. Sometimes a particular defensive feature gets removed because the good guys determined it didn't protect that well in the first place or had unexpected weaknesses.

If you, like me, have been in the computer security world long enough, you’ve seen a lot of security tech come and go. It’s almost to the point where you can start to predict what will stick and be improved and what will sooner or later become obsolete. The pace of change in attacks and technology alike means that even so-called cutting-edge defenses, like biometric authentication and advanced firewalls, will eventually fail and go away. Surveying today's defense technologies, here's what I think is destined for the history books.

Doomed security technology No. 1: Biometric authentication

Biometric authentication is a tantalizing cure-all for log-on security. After all, using your face, fingerprint, DNA, or some other biometric marker seems like the perfect log-on credential -- to someone who doesn't specialize in log-on authentication. As far as those experts are concerned, it’s not so much that biometric methods are rarely as accurate as most people think; it's more that, once stolen, your biometric markers can't be changed.

Take your fingerprints. Most people have only 10. Anytime your fingerprints are used as a biometric logon, those fingerprints -- or, more accurately, the digital representations of those fingerprints -- must be stored for future log-on comparison. Unfortunately, log-on credentials are far too often compromised or stolen. If the bad guy steals the digital representation of your fingerprints, how could any system tell the difference between your real fingerprints and their previously accepted digital representations?
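To make the replay problem concrete, here's a minimal sketch -- the store, function name, and template bytes are all hypothetical, not any vendor's actual matching code -- of why a stolen digital representation is as good as the finger itself:

```python
import hmac

# Hypothetical server-side store: user -> the digital representation
# captured at enrollment. Real products store richer feature data,
# but the replay problem is identical.
enrolled_templates = {"alice": bytes.fromhex("129fa3b7c4")}

def fingerprint_login(user: str, presented: bytes) -> bool:
    # The server can only compare stored bytes against presented bytes.
    # An attacker replaying the stolen representation is indistinguishable
    # from Alice's real finger -- and Alice can't revoke her fingerprint
    # the way she could revoke a password.
    stored = enrolled_templates.get(user)
    return stored is not None and hmac.compare_digest(stored, presented)

# The stolen template logs in exactly as well as the real finger:
assert fingerprint_login("alice", bytes.fromhex("129fa3b7c4"))
```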

In that case, the only solution might be to tell every system in the world that might rely on your fingerprints to not rely on your fingerprints, if that were even possible. The same is true for any other biometric marker. You'll have a hard time repudiating your real DNA, face, retina scan, and so on if a bad player gets their hands on the digital representation of those biometric markers.

That doesn’t even take into account systems that allow you to log on only with, say, your fingerprint. What happens when you can no longer reliably use your fingerprint? What then?

Biometric markers used in conjunction with a secret only you know (a password, PIN, and so on) are one way to defeat hackers who have your biometric log-on marker. Of course, mental secrets can be captured as well, as often happens with nonbiometric two-factor log-on credentials like smartcards and USB key fobs. In those instances, admins can easily issue you a new physical factor, and you can pick a new PIN or password. That isn't the case when one of the factors is your body.

While biometric logons are fast becoming a trendy security feature, there's a reason they aren’t -- and won't ever be -- ubiquitous. Once people realize that biometric logons aren't what they pretend to be, they will lose popularity and either disappear, always require a second form of authentication, or only be used when high-assurance identification is not needed.

Doomed security technology No. 2: SSL

Secure Sockets Layer was invented by the long-gone Netscape in 1995. For two decades it served us adequately. But if you haven't heard, it is irrevocably broken and can't be repaired, thanks to the POODLE attack. SSL’s replacement, TLS (Transport Layer Security), is slightly better. Of all the doomed security tech discussed in this article, SSL is the closest to being replaced, as it should no longer be used.

The problem? Hundreds of thousands of websites still rely on or allow SSL. If you disable all SSL -- a common default in the latest versions of popular browsers -- all sorts of websites don't work. Or they work, but only because the browser or application accepts "downleveling" to SSL. If it's not websites and browsers, then it's the millions of old SSL-only servers out there.

OpenSSL, the library behind much of the Web's encryption, seems to be under constant attack these days. While roughly half of its publicized vulnerabilities have nothing to do with the SSL protocol itself, SSL flaws account for the other half. Millions of servers still accept SSL connections even though they shouldn't.

Worse, terminology among tech pros is contributing to the problem: nearly everyone in the computer security industry calls TLS digital certificates "SSL certs" even though they don't use SSL. It's like calling a copy machine a Xerox when it's not that brand. If we’re going to hasten the world off SSL, we need to start calling TLS certs "TLS certs."

Make a vow today: Don't use SSL ever, and call Web server certs TLS certs. That's what they are or should be. The sooner we get rid of the word "SSL," the sooner it will be relegated to history's dustbin.
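In practice, refusing SSL takes only a line or two in most TLS stacks. A rough illustration using Python's standard ssl module (the exact constant names vary by language and library version):

```python
import ssl
import urllib.request

# Build a client context that speaks TLS only. PROTOCOL_TLS_CLIENT already
# refuses SSL 2.0/3.0 on modern Pythons; the explicit options below just
# make the intent unmistakable.
ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
ctx.options |= ssl.OP_NO_SSLv2 | ssl.OP_NO_SSLv3  # never "downlevel" to SSL
ctx.load_default_certs()

# Any server that insists on SSL will fail this handshake outright.
with urllib.request.urlopen("https://example.com", context=ctx) as resp:
    print(resp.status)
```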

Doomed security technology No. 3: Public key encryption

This may surprise some people, but most of the public key encryption we use today -- RSA, Diffie-Hellman, and so on -- is predicted to be readable as soon as practical quantum computing arrives. Many, including this author, have long (and so far incorrectly) predicted that usable quantum computing was mere years away. But when researchers finally get it working, most known public key ciphers, including the popular ones, will be readily broken. Spy agencies around the world have been saving encrypted secrets for years waiting for the big breakthrough -- or, if you believe some rumors, they have already solved the problem and are reading all our secrets.
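To see what's at stake, here's a toy Diffie-Hellman exchange (absurdly small numbers, chosen for illustration only). Its entire security rests on the difficulty of taking discrete logarithms -- exactly the problem Shor's algorithm solves efficiently on a large enough quantum computer:

```python
# Toy Diffie-Hellman; real deployments use primes of 2,048 bits or more.
p, g = 23, 5        # public prime modulus and generator
a, b = 6, 15        # Alice's and Bob's private exponents

A = pow(g, a, p)    # Alice publishes A = g^a mod p
B = pow(g, b, p)    # Bob publishes   B = g^b mod p

shared_alice = pow(B, a, p)
shared_bob = pow(A, b, p)
assert shared_alice == shared_bob  # both sides derive the same secret

# A classical eavesdropper sees only (p, g, A, B). A quantum one could
# take the discrete log of A, recover a, and compute the secret too.
```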

Some crypto experts, like Bruce Schneier, have long been dubious about how soon practical quantum computing will arrive. But even the critics can't dismiss the likelihood that, once it's figured out, any secret encrypted by RSA, Diffie-Hellman, or even ECC will be immediately readable.

That's not to say there aren't quantum-resistant cipher algorithms. There are a few, including lattice-based cryptography and Supersingular Isogeny Key Exchange. But if your public cipher isn't one of those, you're out of luck if and when quantum computing becomes widespread.

Doomed security technology No. 4: IPsec

When enabled, IPsec cryptographically protects all network traffic between two or more points for packet integrity and privacy -- in other words, it encrypts the traffic. Invented in 1993 and made an open standard in 1995, IPsec is widely supported by hundreds of vendors and used on millions of enterprise computers.

Unlike most of the doomed security defenses discussed in this article, IPsec works and works great. But its problems are two-fold.

First, although widely used and deployed, it has never reached the critical mass necessary to keep it in use for much longer. Plus, IPsec is complex and isn't supported by all vendors. Worse, it can often be defeated by a single device between the source and destination that doesn't support it -- such as a gateway or load balancer. At many companies, the number of computers that get IPsec exceptions is greater than the number of computers forced to use it.

IPsec's complexity also creates performance issues. When enabled, it can significantly slow down every connection using it, unless you deploy specialized IPsec-enabled hardware on both sides of the tunnel. Thus, high-volume transaction servers such as databases and most Web servers simply can’t afford to employ it. And those two types of servers are precisely where most important data resides. If you can't protect most data, what good is it?

Plus, despite being a "common" open standard, IPsec implementations from different vendors often don't interoperate, another factor that has slowed or prevented widespread adoption of IPsec.

But the death knell for IPsec is the ubiquity of HTTPS. When you have HTTPS enabled, you don't need IPsec. It's an either/or decision, and the world has spoken. HTTPS has won. As long as you have a valid TLS digital certificate and a compatible client, it works: no interoperability problems, low complexity. There is some performance impact, but it’s not noticeable to most users. The world is quickly becoming a default world of HTTPS. As that progresses, IPsec dies.

Doomed security technology No. 5: Firewalls

The ubiquity of HTTPS essentially spells the doom of the traditional firewall. I wrote about this in 2012, creating a mini-firestorm that won me invites to speak at conferences all over the world.

Some people would say I was wrong. Three years later, firewalls are still everywhere. True, but most aren't properly configured, and almost none have the "least permissive, block-by-default" rules that make a firewall valuable in the first place. Most firewalls I come across have overly permissive rules. I often see "Allow All ANY ANY" rules, which essentially means the firewall is worse than useless. It's doing nothing but slowing down network connections.

However you define a firewall, it must include some portion that allows only specific, predefined ports in order to be useful. As the world moves to HTTPS-only network connections, all firewalls will eventually have only a few rules: HTTPS and maybe DNS. Even protocols such as DNS and DHCP will likely start running over HTTPS too. In fact, I can't imagine a future that doesn't end up HTTPS-only. When that happens, what of the firewall?
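For the record, the "least permissive, block-by-default" behavior that makes a firewall worth having is easy to express. Here's a minimal sketch with an invented rule table (not any vendor's rule engine); note how a single "Allow All ANY ANY" rule would make the whole evaluation pointless:

```python
# Default deny: a packet passes only if it matches an explicit allow rule.
ALLOW_RULES = [
    ("tcp", 443),   # HTTPS
    ("tcp", 80),    # HTTP
    ("udp", 53),    # DNS
]

def allow_packet(protocol: str, dest_port: int) -> bool:
    return (protocol, dest_port) in ALLOW_RULES

assert allow_packet("tcp", 443)        # HTTPS gets through
assert not allow_packet("tcp", 23)     # Telnet is dropped by default
```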

The main protection firewalls offer is to secure against a remote attack on a vulnerable service. Remotely vulnerable services, usually exploited by one-touch, remotely exploitable buffer overflows, used to be among the most common attacks. Look at the Robert Morris Internet worm, Code Red, Blaster, and SQL Slammer. But when's the last time you heard of a global, fast-acting buffer overflow worm? Probably not since the early 2000s, and nothing since has come close to the worms of that era. Essentially, if you don't have an unpatched, vulnerable listening service, then you don't need a traditional firewall -- and right now you don't. Yep, you heard me right. You don't need a firewall.

Firewall vendors often write to tell me that their "advanced" firewall has features beyond the traditional firewall that make it worth buying. Well, I've been waiting for more than two decades for "advanced firewalls" to save the day. It turns out they don't. If they perform "deep packet inspection" or signature scanning, they either slow down network traffic too much, are rife with false positives, or scan for only a small subset of attacks. Most "advanced" firewalls scan for a few dozen to a few hundred attacks. These days, more than 390,000 new malware programs are registered every day, not including all the hacker attacks that are indistinguishable from legitimate activity.

Even when firewalls do a perfect job at preventing what they say they prevent, they don't really help, given that they don't stop the two biggest attack vectors most organizations face on a daily basis: unpatched software and social engineering.

Put it this way: Every customer and person I know who currently runs a firewall gets hacked as much as those who don't. I don't fault firewalls. Perhaps they worked so well back in the day that hackers moved on to other sorts of attacks. For whatever reason, firewalls are nearly useless today and have been trending in that direction for more than a decade.

Doomed security technology No. 6: Antivirus scanners

Depending on whose statistics you believe, malware programs currently number in the tens to hundreds of millions -- an overwhelming fact that has rendered antivirus scanners nearly useless.

Not entirely useless, because they stop 80 to 99.9 percent of attacks against the average user. But the average user is exposed to hundreds of malicious programs every year; even with the best odds, the bad guy wins every once in a while. If you keep your PC free from malware for more than a year, you've done something special.

That isn’t to say we shouldn’t applaud antivirus vendors. They've done a tremendous job against astronomical odds. I can't think of another sector that has had to cope with such relentless growth in both the volume and the sophistication of attacks since the late 1980s, when there were only a few dozen viruses to detect.

But what will really kill antivirus scanners isn't this glut of malware. It's whitelisting. Right now the average computer will run any program you install. That's why malware is everywhere. But computer and operating system manufacturers are beginning to reset the "run anything" paradigm for the safety of their customers -- a movement that is antithetical to antivirus programs, which allow everything to run unimpeded except for programs that contain one of the more than 500 million known antivirus signatures. “Run by default, block by exception” is giving way to “block by default, allow by exception.”
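Conceptually, whitelisting inverts the antivirus check: instead of blocking known-bad programs, it runs only known-good ones. Here's a minimal sketch with a hypothetical hash list (real products such as AppLocker and Gatekeeper also verify code-signing certificates, not just hashes):

```python
import hashlib

# Hypothetical allow list: SHA-256 hashes of the only binaries permitted to run.
APPROVED_SHA256 = {
    "b5bb9d8014a0f9b1d61e21e796d78dccdf1352f23cd32812f4850b878ae4944c",
}

def may_execute(path: str) -> bool:
    with open(path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    # Block by default: an unknown binary never runs, even brand-new
    # malware that no antivirus signature has ever seen.
    return digest in APPROVED_SHA256
```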

Of course, computers have long had whitelisting programs, aka application control programs. I reviewed some of the more popular products back in 2009. The problem: Most people don't use whitelisting, even when it’s built in. The biggest roadblocks? Fear of what users will do if they can't install everything they want willy-nilly, and the management headache of having to approve every program that can run on a user’s system.

But malware and hackers are becoming ever more pervasive and destructive, and vendors are responding by enabling whitelisting by default. Apple's OS X introduced a near-default form of whitelisting three years ago with Gatekeeper. iOS devices have had near-whitelisting far longer in that they can run only approved applications from the App Store (unless the device is jailbroken). Some malicious programs have slipped by Apple, but the process has been incredibly successful at stopping the huge influx of malware that normally follows popular OSes and programs.

Microsoft has long had a similar mechanism, through Software Restriction Policies and AppLocker, but an even stronger push is coming in Windows 10 with Device Guard. Microsoft’s Windows Store also offers the same protections as Apple's App Store. While Microsoft won't be enabling Device Guard or Windows Store-only applications by default, the features are there and are easier to use than before.

Once whitelisting becomes the default on most popular operating systems, it's game over for malware and, subsequently, for antivirus scanners. I can't say I'll miss either.

Doomed security technology No. 7: Antispam filters

Spam still makes up more than half of the Internet's email. You might not notice this anymore, thanks to antispam filters, which have reached levels of accuracy that antivirus scanners can only dream of. Yet spammers keep spitting out billions of unwanted messages each day. In the end, only two things will ever stop them: universal, pervasive, high-assurance authentication and more cohesive international laws.
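For context, most filters earn that accuracy by scoring messages statistically. A minimal naive-Bayes-style sketch, with word weights invented for illustration (real filters learn them from millions of labeled messages):

```python
# Positive weights push a message toward "spam," negative toward "ham."
SPAM_WEIGHT = {"money": 2.0, "bag": 1.2, "meeting": -1.5, "invoice": -0.5}

def spam_score(message: str) -> float:
    # Sum the log-likelihood-style weights of the words we recognize.
    return sum(SPAM_WEIGHT.get(w, 0.0) for w in message.lower().split())

print(spam_score("i have a bag of money to mail you"))  # positive: spam
print(spam_score("agenda for the meeting"))             # negative: ham
```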

Spammers still exist mainly because we can't easily catch them. But as the Internet matures, pervasive anonymity will be replaced by pervasive high-assurance identities. At that point, when someone sends you a message claiming to have a bag of money to mail you, you will be assured they are who they say they are.

High-assurance identities can only be established when all users are required to adopt two-factor (or higher) authentication to verify their identity, followed by identity-assured computers and networks. Every cog in between the sender and the receiver will have a higher level of reliability. Part of that reliability will be provided by pervasive HTTPS (discussed above), but it will ultimately require additional mechanisms at every stage of authentication to assure that when I say I'm someone, I really am that someone.

Today, almost anyone can claim to be anyone else, and there's no universal way to verify that person's claim. This will change. Almost every other critical infrastructure we rely on -- transportation, power, and so on -- requires this assurance. The Internet may be the Wild West right now, but the increasingly essential nature of the Internet as infrastructure virtually ensures that it will eventually move in the direction of identity assurance.

Meanwhile, the international border problem that permeates nearly every online-criminal prosecution is likely to be resolved in the near future. Right now, many major countries do not accept evidence or warrants issued by other countries, which makes arresting spammers (and other malicious actors) nearly impossible. You can collect all the evidence you like, but if the attacker’s home country won't enforce the warrant, your case is toast.

As the Internet matures, however, countries that don't help ferret out the Internet's biggest criminals will be penalized. They may be placed on a blacklist. In fact, some already are. For example, many companies and websites reject all traffic originating from China, whether it's legitimate or not. Once we can identify criminals and their home countries beyond repudiation, as outlined above, those home countries will be forced to respond or suffer penalties.

The heyday of the spammers, when most of their crap reached your inbox, is already over. Pervasive identities and international law changes will close the coffin lid on spam -- and the security tech necessary to combat it.

Doomed security technology No. 8: Anti-DoS protections

Thankfully, the same pervasive identity protections mentioned above will be the death knell for denial-of-service (DoS) attacks and the technologies that have arisen to quell them.

These days, anyone can use free Internet tools to overwhelm websites with billions of packets. Most operating systems have built-in anti-DoS protections, and more than a dozen vendors can protect your websites even when they're being hit by extraordinary amounts of bogus traffic. But the loss of pervasive anonymity will stop all malicious senders of DoS traffic. Once we can identify them, we can arrest them.

Think of it this way: Back in the 1920s there were a lot of rich and famous bank robbers. Banks finally beefed up their protection, and cops got better at identifying and arresting them. Robbers still hit banks, but they rarely get rich, and they almost always get caught, especially when they persist in robbing more banks. The same will happen to DoS attackers. The sooner we can identify them, the sooner they will disappear as the bothersome elements of society that they are.

Doomed security technology No. 9: Huge event logs

Computer security event monitoring and alerting is difficult. Every computer can easily generate tens of thousands of events on its own each day. Collect them in a centralized logging database and pretty soon you're talking petabytes of needed storage. Today's event log management systems are often lauded for the vast size of their disk storage arrays.

The only problem: This sort of event logging doesn't work. When nearly every collected event is worthless and goes unread, and the cumulative effect of all those worthless, unread events is a huge storage cost, something has to give. Soon enough admins will require application and operating system vendors to give them more signal and less noise, passing along useful events without the mundane log clutter. In other words, event log vendors will soon be bragging about how little space they take rather than how much.
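The shift the author predicts boils down to filtering at the source. A minimal sketch of severity-based forwarding, with field names assumed for illustration:

```python
# Forward only events worth a human's attention; drop the mundane
# clutter locally instead of shipping petabytes to the central store.
FORWARD_LEVELS = {"critical", "alert", "warning"}

def should_forward(event: dict) -> bool:
    return event.get("severity") in FORWARD_LEVELS

events = [
    {"severity": "info", "msg": "heartbeat ok"},
    {"severity": "critical", "msg": "10 failed logons for account admin"},
]
signal = [e for e in events if should_forward(e)]  # only the second survives
```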

Doomed security technology No. 10: Anonymity tools (not to mention anonymity and privacy)

Lastly, any mistaken vestige of anonymity and privacy will be completely wiped away. We already really don't have it. The best book I can recommend on the subject is Bruce Schneier's "Data and Goliath." A quick read will scare you to death if you didn't already realize how little privacy and anonymity you truly have.

Even hackers who think that hiding on Tor and other "darknets" gives them some semblance of anonymity must understand how quickly the cops are arresting people doing bad things on those networks. Anonymous kingpin after anonymous kingpin ends up arrested, identified in court, and serving a real jail sentence under his real identity.

The truth is, anonymity tools don't work. Many companies, and certainly law enforcement, already know who you are. The only difference is that, in the future, everyone will know the score and stop pretending they are staying hidden and anonymous online.

I would love for a consumer's bill of rights guaranteeing privacy to be created and passed, but past experience teaches me that too many citizens are more than willing to give up their right to privacy in return for supposed protection. How do I know? Because it's already the standard everywhere but the Internet. You can bet the Internet is next.


Tuesday 18 August 2015

Windows 10's usage share ascent routs Windows 7's

Early data shows Microsoft's new OS has come out of the gate faster than the venerable Windows 7 did six years ago

Microsoft's Windows 10 has easily surpassed the global usage share ascension of Windows 7 six years ago, putting it on track to become the company's most successful OS introduction ever.

Windows 10's usage share neared 6.6% on Sunday, data from analytics vendor StatCounter showed. That was a 23% increase over what the OS logged the Sunday prior, continuing the unbroken stretch of double-digit-or-more, week-over-week increases since Windows 10's July 29 debut.

The operating system's usage share increase has comfortably exceeded that of its predecessor, Windows 7, during the latter's first 18 days of availability in the fall of 2009. Windows 7's high-water mark during that stretch was 3.8% by StatCounter's count.

StatCounter estimates what Computerworld has dubbed "usage share" by tallying page views, making the Irish metric firm's numbers a signal of activity on the Internet rather than of users.

Windows 7's uptake as measured by StatCounter was quicker off the mark than Windows 10's in the first few days, a fact that should not come as a surprise, since the former was released as a paid upgrade that users had been eagerly buying in the run-up to its Oct. 22, 2009, official launch.

Microsoft treated Windows 10 differently, choosing not only to give away the upgrade to hundreds of millions of Windows 7 and Windows 8.1 users, but to deliver that upgrade to customer subsets over time. The staged rollout was part of an effort to minimize stress on the Redmond, Wash., company's content distribution network and to let Microsoft track problems, then presumably fix them, before triggering upgrade notifications for the next "wave."

By the fourth day after its introduction, Windows 10's usage share had topped that of Windows 7 at the same point in its post-debut schedule. Windows 10 has not relinquished the lead since then.




Wednesday 12 August 2015

Sorriest technology companies of 2015

A rundown of the year in apologies from tech vendors and those whose businesses rely heavily on tech.

Sorry situation
Despite all the technology advances that have rolled out this year, it’s also been a sorry state of affairs among leading network and computing vendors, along with businesses that rely heavily on technology. Apple, Google, airlines and more have issued tech-related mea culpas in 2015…

Sony says Sorry by saying Thanks
Network outages caused by DDoS attacks spoiled holiday fun for those who got new PlayStation 4 games and consoles, so Sony kicked off 2015 with an offer of 10% off new purchases, plus an extended free trial for some.

NSA’s backdoor apology
After getting outed by Microsoft and later Edward Snowden for allowing backdoors to be inserted into devices via a key security standard, the NSA sort of apologized. NSA Director of Research Michael Wertheimer, writing in the Notices of the American Mathematical Society, acknowledged mistakes were made in “The Mathematics Community and the NSA.” He wrote in part: “With hindsight, NSA should have ceased supporting the Dual_EC_DRBG algorithm immediately after security researchers discovered the potential for a trapdoor.”

You probably forgot about this flag controversy
China’s big WeChat messaging service apologized in January for bombarding many of its hundreds of millions of users -- and not just those in the United States -- with Stars and Stripes icons whenever they typed in the words “civil rights” on Martin Luther King, Jr. Day. WeChat also took heat for not offering any sort of special icons when users typed in patriotic Chinese terms. The special flag icons were only supposed to have been seen by US users of the service.

Go Daddy crosses the line
Domain registrar Go Daddy, as usual, relied on scantily clad women as well as animals to spread its message during this past winter’s Super Bowl. The surprising thing is that the animals are what got the company in hot water this time. The company previewed an ad that was supposed to parody Budweiser commercials, but its puppy mill punch line didn’t have many people laughing, so the CEO wound up apologizing and pulling the ad.

Name calling at Comcast
Comcast scrambled to make things right after somehow changing the name of a customer on his bill to “(expletive… rhymes with North Pole) Brown” from his actual name, Ricardo Brown. The change took place after Brown’s wife called Comcast to discontinue cable service. The service provider told a USA Today columnist that it was investigating the matter, but in the meantime was refunding the Browns for two years of previous service.

Where to start with Google?
Google’s Department of Apologies has been busy this year: In January the company apologized when its translation services spit out anti-gay slurs in response to searches on the terms “gay” and “homosexual.” In May, Google apologized after a Maps user embedded an image of the Android mascot urinating on Apple’s logo. This summer, Google has apologized for its new Photos app mislabeling African Americans as “gorillas” and for Google Niantic Labs’ Ingress augmented reality game including the sites of former Nazi concentration camps as points of interest.

Carnegie Mellon admissions SNAFU
Carnegie Mellon University’s Computer Science School in February apologized after it mistakenly accepted 800 applicants to its grad program, only to send out rejection notices hours later. The irony of a computer glitch leading to this problem at such a renowned computer science school was lost on no one…

Lenovo Superfish debacle
Lenovo officials apologized in February after it was discovered that Superfish adware packaged with some of its consumer notebooks was not only a pain for users but also included a serious security flaw resulting from interception of encrypted traffic. “I have a bunch of very embarrassed engineers on my staff right now,” said Lenovo CTO Peter Hortensius. “They missed this.” Lenovo worked with Microsoft and others to give users tools to rid themselves of Superfish.

Apple apologizes for tuning out customers
Apple apologized in March for an 11-hour iTunes service and App Store outage that it blamed on “an internal DNS error at Apple,” in a statement to CNBC.

Blame the iPads
American Airlines in April apologized after digital map application problems on pilot iPads delayed dozens of flights over a two-day period. The airline did stress that the problem was a third-party app, not the Apple products themselves.

Locker awakened
The creator of a strain of ransomware called Locker apologized after he “woke up” the malware, which encrypted files on infected devices and asked for money to release them. A week after the ransomware was activated, the creator apparently had a change of heart and released the decryption keys needed by victims to unlock their systems.

HTC wants to be Hero
Phonemaker HTC’s CEO Cher Wang apologized to investors in June, according to the Taipei Times, after the company’s new One M9 flagship phone failed to boost sales. “HTC’s recent performance has let people down,” said Wang, pointing to better times ahead with the planned fall release of a new phone dubbed Hero.

Ketchup for adults only
Ketchup maker Heinz apologized in June after an outdated contest-related QR code on its bottles sent a German man to an X-rated website. Meanwhile, the website operator offered the man who complained a free year’s worth of access, which he declined.

Livid Reddit users push out interim CEO
Interim Reddit CEO Ellen Pao apologized in July (“we screwed up”) after the online news aggregation site went nuts over the sudden dismissal of an influential employee known for her work on the site’s popular Ask Me Anything section. Pao shortly afterwards resigned from her post following continued demands for her ouster by site users.

Blame the router
United Airlines apologized (“we experienced a network connectivity issue. We are working to resolve and apologize for any inconvenience.”) in July after being forced to ground its flights for two hours one morning due to a technology issue that turned out to be router-related. United has suffered a string of tech glitches since adopting Continental’s passenger management system a few years back following its merger with that airline.

Billion dollar apology
Top Toshiba executives resigned in July following revelations that the company had systematically padded its profits by more than $1 billion over a six-year period. “I recognize there has been the most serious damage to our brand image in our 140-year history,” said outgoing President Hisao Tanaka, who is to be succeeded by Chairman Masashi Muromachi. “We take what the committee has pointed out very seriously, and it is I and others in management who bear responsibility.”



Saturday 1 August 2015

Top 5 factors driving domestic IT outsourcing growth

Despite insourcing efforts, the expansion of nearshore centers is not necessarily taking work away from offshore locations. Eric Simonson of the Everest Group discusses the five main drivers responsible for the rise in domestic outsourcing, why Indian providers dominate the domestic landscape and more.

IT service providers placed significant focus on staffing up their offshore delivery centers during the previous decade. However, over the past five years, outsourcing providers have revved up their U.S. domestic delivery center activity, according to recent research by outsourcing consultancy and research firm Everest Group.

The American outsourcing market currently employs around 350,000 full-time professionals and is growing between three and 20 percent a year depending on function, according to Everest Group’s research.

Yet the expansion of nearshore centers is not necessarily taking work away from offshore locations in India and elsewhere. Big insourcing efforts, like the one announced by GM, remain the exception. Companies are largely sticking with their offshore locations for existing non-voice work and considering domestic options for new tasks, according to Eric Simonson, Everest Group’s managing partner for research.

We spoke to Simonson about the five main drivers for domestic outsourcing growth, the types of IT services growing stateside, why Indian providers dominate the domestic landscape, and how providers plan to meet the growing demand for U.S. IT services skills.

Interest in domestic IT outsourcing is on the rise, but you say that that does not indicate any dissatisfaction with the offshore outsourcing model.

Simonson: This isn’t about offshore not working and companies deciding to bring the work back. That’s happening a bit with some call center and help desk functions. But, by and large, these delivery center setups are more about bringing the wisdom of global delivery into the domestic market. The fundamental goal is industrializing the onshore model vs. fixing what’s broken offshore.

Can you talk about the five main drivers behind this increased interest in locating stateside?
Simonson: The first is diversification of buyer needs. As buyers have to support new types of services, certain types of tasks may be better delivered nearshore rather than offshore.

Secondly, there may be a desire to leverage the soft skills of onshore talent. This occurs when you need someone with a certain type of domestic business knowledge or dialect or cultural affinity.

Thirdly, domestic sourcing can be a way to overcome the structural challenges associated with offshore delivery, such as high attrition and burnout in graveyard shifts.

Fourth, companies may be seeking to manage certain externalities, like regulatory requirements or fears about visa availability. To some extent, these reasons are not necessarily based on true requirements; they are a convenient justification for choosing domestic outsourcing over the potential risks of offshore.

Finally, there may be client-specific needs that demand domestic solutions—a local bank that wants to keep jobs in the community or a company with no experience offshore looking to start the learning curve.

Within IT services, what types of work currently dominate the domestic landscape?
Simonson: Application development is most prominent, with 123 domestic delivery centers in tier-one and -two cities serving financial services, public sector, manufacturing, retail and consumer packaged goods clients. Just behind that is IT infrastructure in similar geographies focused on those verticals as well. There are 80 consulting and systems integration centers and 68 testing centers as well.

It’s interesting to note that while U.S.-based providers tend to operate larger IT service centers domestically, it’s actually the Indian providers that dominate the landscape.

Simonson: Traditional U.S.-based multinationals have captured more scale in individual centers and have been able to grow them, in some ways, more strategically. They’ve been able to set up shop in smaller tier-4 cities like Ann Arbor or Des Moines and have more proven local talent models.

But the majority of domestic centers are operated by India-centric providers. Part of that is driven by their desire to get closer to their customers. With application and systems integration work, the ability to work more closely with the client is increasingly valuable. And with infrastructure work, concerns about data and systems access have encouraged Indian companies to offer more onshore options.

In addition, some of the bad press they’ve received related to visa issues is encouraging them to balance out their delivery center portfolios.

But Indian providers are not necessarily staffing up their centers with American workers.
Simonson: Indian providers are more likely to use visas to bring citizens of other countries (predominantly India) into the country to work on a temporary or permanent basis in a delivery center. About 32 percent of their domestic delivery center workforce is composed of these ‘landed resources.’ Across all providers, landed resources account for six percent of domestic service delivery employees. However, tightening visa norms and higher visa rejection rates are making it more difficult for providers to rely on foreign workers.

You found that approximately 43 percent of the delivery centers are located in the South, with almost half of those concentrated in the South Atlantic. And Texas has more than fifty. Is that simply due to the fact that it’s cheaper to operate there?

Simonson: Cheap helps. But equally important are overall population trends. The South is growing, while regions like the Northeast or Midwest are either stable or on the decline. If you look at where people are going to school or moving and where corporations are relocating their headquarters, it’s taking place from the Carolinas down through Florida and over through Arkansas, Oklahoma and Texas. Those states are also more progressive about attracting services businesses (although there are some exceptions outside of the South, like North Dakota and Missouri).

Do you expect the domestic IT outsourcing market to continue to grow?
Simonson: Yes, service providers expect an increase in demand for domestic outsourcing services by new and existing customers, and plan to increase their domestic delivery capabilities by adding more full time employees to their existing centers and establishing new delivery centers. In fact, 60 percent of delivery centers are planning to add headcount over the next three years with India-centric service providers expected to lead the expansion.

Tier-2 and tier-3 cities, like Orlando, Atlanta and Rochester, are poised for the greatest growth, with tier-1 and rural centers expecting the least amount of growth.

Will the supply of domestic IT talent keep up with this increased demand?
Simonson: The pressure to find IT talent has led service providers to adopt a range of approaches to extend their reach and develop ecosystems of talent. Many have developed educational partnerships, creating formal and informal relationships with colleges and technical institutes. They’re also basing themselves in cities known for their quality of life and recruiting entry-level and experienced talent from elsewhere. It all impacts what communities they decide to work in.

All service providers will have to expand their talent pools, particularly in IT. Automation of some tasks could increase capacity, but doesn’t provide the higher-complexity skills that are most valued onshore.
