Friday, 27 December 2013

Chromebooks' success punches Microsoft in the gut

Amazon, NPD Group trumpet sales of the bare-bones laptops in 2013 to consumers and businesses
Chromebooks had a very good year, according to retailer Amazon.com and industry analysts.

And that's bad news for Microsoft.

The pared-down laptops powered by Google's browser-based Chrome OS have surfaced this year as a threat to "Wintel," the Microsoft-Intel duopoly that has dominated the personal-computer space for decades with Windows machines.

On Thursday, Amazon.com called out a pair of Chromebooks -- one from Samsung, the other from Acer -- as two of the three best-selling notebooks during the U.S. holiday season. The third: Asus' Transformer Book, a Windows 8.1 "2-in-1" device that transforms from a 10.1-in. tablet to a keyboard-equipped laptop.

As of late Thursday, the trio retained their lock on the top three places on Amazon's best-selling-laptop list in the order of Acer, Samsung and Asus. Another Acer Chromebook, one that sports 32GB of on-board storage space -- double the 16GB of Acer's lower-priced model -- held the No. 7 spot on the retailer's top 10.

Chromebooks' holiday success at Amazon was duplicated elsewhere during the year, according to the NPD Group, which tracked U.S. PC sales to commercial buyers such as businesses, schools, government and other organizations.

By NPD's tallies, Chromebooks accounted for 21% of all U.S. commercial notebook sales in 2013 through November, and 10% of all computers and tablets. Both shares were up massively from 2012; last year, Chromebooks accounted for an almost-invisible two-tenths of one percent of all computer and tablet sales.

Stephen Baker of NPD pointed out what others had said previously: Chromebooks have capitalized on Microsoft's stumble with Windows 8. "Tepid Windows PC sales allowed brands with a focus on alternative form factors or operating systems, like Apple and Samsung, to capture significant share of a market traditionally dominated by Windows devices," Baker said in a Monday statement.

Part of the attraction of Chromebooks is their low prices: The systems forgo high-resolution displays, rely on inexpensive graphics chipsets, include paltry amounts of RAM -- often just 2GB -- and get by with little local storage. And their operating system, Chrome OS, doesn't cost computer makers a dime.

The 11.6-in. Acer C720 Chromebook, first on Amazon's top-10 list Thursday, costs $199, while the Samsung Chromebook, at No. 2, runs $243. Amazon prices Acer's 720P Chromebook, No. 7 on the chart, at $300.

The prices were significantly lower than those for the Windows notebooks on the retailer's bestseller list. The average price of the seven Windows-powered laptops on Amazon's top 10 was $359, while the median was $349. Meanwhile, the average price of the three Chromebooks was $247 and the median was $243, representing savings of roughly 31% and 30%, respectively.
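Those averages, medians and savings follow directly from the listed prices; a quick check in Python (taking the article's Windows-laptop figures as given) reproduces them:

```python
from statistics import mean, median

# Chromebook prices from Amazon's top-10 list (Acer C720, Samsung, Acer 720P)
chromebooks = [199, 243, 300]

avg = mean(chromebooks)      # ~247.33
med = median(chromebooks)    # 243

# Windows-laptop aggregates as reported in the article
win_avg, win_med = 359, 349

savings_avg = (1 - avg / win_avg) * 100   # ~31%
savings_med = (1 - med / win_med) * 100   # ~30%
print(round(avg), med, round(savings_avg), round(savings_med))
```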

In many ways, Chromebooks are the successors to "netbooks," the cheap, lightweight and underpowered Windows laptops that stormed into the market in 2007, peaked in 2009 as they captured about 20% of the portable PC market, then fell by the wayside in 2010 and 2011 as tablets assumed their roles and full-fledged notebooks closed in on netbook prices.

Chromebooks increasingly threaten Windows' place in the personal computer market, particularly the laptop side, whose sales dominate those of the even older desktop form factor. Stalwart Microsoft partners, including Lenovo, Hewlett-Packard and Dell, have all dipped toes into the Chromebook waters, for example.

"OEMs can't sit back and depend on Wintel anymore," said Baker in an interview earlier this month.

Microsoft has been concerned enough with Chromebooks' popularity to target the devices with attack ads in its ongoing "Scroogled" campaign, arguing that they are not legitimate laptops.

Those ads are really Microsoft's only possible response to Chromebooks, since the Redmond, Wash., company cannot do to them what it did to netbooks.

Although the first wave of netbooks were powered by Linux, Microsoft quickly shoved the open-source OS aside by extending the sales lifespan of Windows XP, then created deliberately crippled and lower-priced "Starter" editions of Vista and Windows 7 to keep OEMs (original equipment manufacturers) on the Windows train.

But Microsoft has no browser-based OS to show Chromebook OEMs, and has no light-footprint operating system suitable for basement-priced laptops except for Windows RT, which is unsuitable for non-touch screens. And unlike Google, Microsoft can hardly afford to give away Windows.

But Microsoft's biggest problem isn't Chrome OS and the Chromebooks its ads have belittled: It's tablets. Neither Microsoft nor its web of partners has found much success in that market.

Baker's data on commercial sales illustrated that better than a busload of analysts. While Windows notebooks accounted for 34% of all personal computers and tablets sold to commercial buyers in the first 11 months of 2013, that represented a 20% decline from 2012. During the same period, tablets' share climbed by one-fifth to 27%, with Apple's iPad accounting for the majority of the tablets.
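A short sketch shows how those year-over-year figures imply the 2012 shares, assuming both changes are relative (a 20% decline, a climb of one-fifth):

```python
# 2013 shares (Jan-Nov) from NPD, as percentages of commercial PC/tablet sales
windows_2013 = 34.0
tablets_2013 = 27.0

# A 20% relative decline means 2013 = 2012 * 0.8
windows_2012 = windows_2013 / 0.8   # 42.5% in 2012

# Climbing by one-fifth means 2013 = 2012 * 1.2
tablets_2012 = tablets_2013 / 1.2   # 22.5% in 2012

print(windows_2012, tablets_2012)
```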

"The market for personal computing devices in commercial markets continues to shift and change," said Baker. "It is no accident that we are seeing the fruits of this change in the commercial markets as business and institutional buyers exploit the flexibility inherent in the new range of choices now open to them."

But when you're at the top of the personal computing device heap -- as Microsoft was as recently as 2011 -- words like "change" and "choice" are not welcome. From the mountaintop, the only way is down.

Best Microsoft MCTS Certification, Microsoft MCITP Training at certkingdom.com


Tuesday, 17 December 2013

Avaya builds massive Wi-Fi net for 2014 Winter Olympics

BYOD for 30,000 people creates extraordinary network demands

Ottawa, Canada -- Avaya engineers are putting the final touches on a network capable of handling up to 54Tbps of traffic when the Winter Olympics opens on Feb. 7 in the Russian city of Sochi.

Sochi itself is a sprawling city of 350,000 people located on the Black Sea. Archeologists have found human remains in the area that date back tens of thousands of years. Today, Sochi’s subtropical climate makes it a popular Russian tourist spot.

The two locations where the Olympics will take place -- the Olympic village in Sochi and a tight cluster of Alpine venues in the nearby Krasnaya Polyana Mountains -- are completely new construction, so this project represents a greenfield environment for Avaya.

In addition to investing in a telecom infrastructure, Russia is spending billions of dollars to upgrade Sochi’s electric power grid, its transportation system and even its sewage treatment facilities. (Watch a slideshow version of the story.)

“The whole town is nothing but a constantly changing, $50 billion construction site; for instance we’ve seen the road outside our hotel be torn up at least four times. As for modern IT infrastructure? There was none to speak of. We have really had to start from scratch, right down to the laying of conduit before we could even begin installing fiber and cabling,’’ says Dean Frohwerk, Avaya’s chief network architect.


That’s quite a contrast from 2010, when Frohwerk and his team provided telecom and networking services for the games in Vancouver.

And this time around, the demands for bandwidth and connectivity dwarf anything that Avaya delivered in Vancouver, where the network was capable of handling only 4Tbps.

The Sochi network will serve 30,000 athletes, administrators and staff, media, IOC officials, and volunteers with data, voice, video, and full Internet access throughout the Games sites.

Adding to the challenge, “We expect these people to be carrying and using multiple wireless devices,” says Frohwerk. “In Vancouver, we only had to provision one device per user. This means that we really have to have the capability to support up to 120,000 users on the Sochi Wi-Fi network, without issues or interruptions.”

Plus, Avaya has to deliver 30 dedicated HD IPTV Olympic channels via its telecom backbone, and has to make these channels available to Olympic family users over the converged network. (IPTV support is an Olympic first on the network, eliminating the need for a separate CATV HFC network.)

Network upgrade

In Vancouver, Avaya installed the first all-IP converged voice, data and video network at Layer 2. “That network was laid out like a single mammoth installation, which worked well given that wired traffic outnumbered wireless four to one,” says Frohwerk.

“But we expect this equation to turn on its head at Sochi, with wireless being the four and wired traffic being the one. That’s why we have had to change our approach.”

Another lesson from Vancouver, he says, is that “requirements evolve and change during the Games, and that you have to be able to adapt the network configuration to accommodate these changes. We have also seen that ease of use is paramount: With so much going on, network operators must find it simple to make changes on the fly.”

In Sochi, Avaya’s Wi-Fi network will be split into five virtual SSID-based networks. There will be one network for the athletes, two for media (one free, one paid), one for Olympics staff, and one for dignitaries.

Each group will have its own access password, and extra layers of password protection will be added where needed. The Wi-Fi traffic will be distributed using about 2,000 802.11n access points across the Olympic Games sites, including, for the first time, inside the stands.
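Those figures imply a substantial average load per access point; a back-of-the-envelope check using only the numbers cited in the article:

```python
users = 30_000             # athletes, staff, media, officials, volunteers
devices_per_user = 4       # "up to 120,000 users" on the Wi-Fi net / 30,000 people
access_points = 2_000      # approximate 802.11n AP count

max_devices = users * devices_per_user        # 120,000 concurrent devices, worst case
devices_per_ap = max_devices / access_points  # average devices each AP must handle
print(max_devices, devices_per_ap)
```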

The network will be headquartered in a primary Technical Operations Center (TOC) in the coastal city of Adler, alongside the Primary Data Center. The secondary TOC and Data Center will be at the Sochi Olympic Park, located 10 miles northwest at the Games site.

Each TOC will be in a 50-foot by 70-foot control room. While one TOC is in use, the other will be kept in standby mode by a skeleton crew. Each TOC will be connected to the outside world by 10Gbps links provided by Rostelecom, Russia's national telecom operator.

“We have built the TOCs in separate locations to ensure redundancy in the case of a natural disaster or man-made incident,” says Frohwerk. “Should the Adler TOC go down, we would simply send the next shift to the Sochi TOC and carry on.”

The data and voice backbone is built on Avaya’s Fabric Connect, an open virtualization platform based on IEEE 802.1aq Shortest Path Bridging that enables a network fabric within/between data centers and the sites they serve.

At the core of the network are four Virtual Enterprise Network Architecture (VENA)-enabled Virtual Service Platform (VSP) 9000 switches, one in each TOC and one more in each of the mountain cluster points of presence.

Using Avaya ERS 8800 switches located at the network’s edge, the whole Sochi network will be virtualized at Layer 3 instead of Layer 2.

“Using a Layer 3 virtual software layer means that our switches can act intelligently locally, and do a better job of routing traffic,” Frohwerk says. “This reduces traffic jams, which means more uptime and better network speeds. It’s a step up from what we did in Vancouver, because the demands we’re facing are so much bigger here.”

Using Layer 3 will let Avaya’s network operators serve many more endpoints than they could in Vancouver. Each device logging in will get its own media access control address: Avaya will use pre-installed 802.1X certificates, the MAC address, or a captive portal to authenticate the device, while controlling the access level and bandwidth with the company’s Identity Engines software.
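The admission logic described, 802.1X certificate first, then MAC address, then captive portal, can be sketched as a simple fallback chain. The function and device fields below are hypothetical illustrations, not Avaya's Identity Engines API:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Device:
    mac: str
    cert_valid: bool                        # pre-installed 802.1X certificate checks out
    portal_credentials: Optional[str] = None

# Hypothetical set of pre-registered device MAC addresses
KNOWN_MACS = {"aa:bb:cc:dd:ee:01"}

def admit(device: Device) -> str:
    """Return the access method used, mirroring the fallback order in the article."""
    if device.cert_valid:
        return "802.1X"           # strongest: certificate-based authentication
    if device.mac in KNOWN_MACS:
        return "MAC"              # weaker: address-based admission
    if device.portal_credentials:
        return "captive-portal"   # interactive sign-in as a last resort
    return "denied"

print(admit(Device(mac="aa:bb:cc:dd:ee:02", cert_valid=True)))  # 802.1X
```

Access level and bandwidth enforcement would then hang off whichever method admitted the device, which is the role the article assigns to Identity Engines.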

Avaya is also providing voice services and 6,500 voicemail boxes at Sochi using Avaya Aura Communication Manager (CM), Session Manager (SM), System Manager (SYMGR) and CM Messaging.

Challenges

Moving goods into Russia can be time-consuming in the best of situations. But moving massive amounts of equipment in time for the Games has been a real challenge for Avaya.

“This is why we have had people in Sochi for the past 18 months, to keep things coordinated and to make sure supplies get where they need to be,” says Frohwerk. “You can’t leave things to chance.”

Even with this level of supervision, it’s been a nerve-wracking experience for Avaya, like the time one of its equipment trucks lost radio contact for days while travelling through rural Kazakhstan.

Another truck arrived in Sochi with unprotected/uncushioned computer hardware after driving over hundreds of miles of bumpy, rough roads. “In both cases, the equipment finally arrived in usable shape,” he says. “But we had a few tense moments there for sure.”

Another challenge is training. In line with its agreement with Avaya's Russian partners, the company is training 170 Russian technicians to provide Tier 1 and Tier 2 network support during the Games. A 30-person team from Avaya Global Support Services will provide Tier 3/4 support from Sochi's TOC, backed by Avaya R&D staff around the world.

The training of these Russian technicians is under way, and Avaya staff have rotated on site to train for the Olympics.

“We are doing our best to be well-prepared for whatever the Games throw at us,” Frohwerk says. Avaya’s outdoor systems are designed to handle extreme weather: “We’re not worried if it snows,” he says. “In fact, we hope it does, because these are the Winter Games, after all.”

At press time, Avaya had completed installing all of its equipment in Sochi. The company is now moving into test mode, pushing the network’s limits by putting it through multiple types of failure scenarios.

Après ski

After the Games end on Feb. 23, much of Avaya's infrastructure will be removed. But the telecom facilities it has built for the Games, including the telephone and IP networking for the Olympic skiing venue in the Caucasus, where a new resort town is being erected, will remain.

The company will also help develop telecom facilities for the Grand Prix auto races that will be held in Sochi later in 2014, and for the soccer matches there that will be part of the 2018 World Cup.

“We will be leaving behind quite a legacy telecom system when we leave Sochi,” says Frohwerk.




Thursday, 12 December 2013

10 top tests of 2013

Network World tested hundreds of products in 2013, but here are our top 10 tests of the year. In order to make the list, the product review had to be a comparative test of multiple products in a single category and it had to break new ground or deliver fresh insight into an important product area.

Here’s the list:

1. WAN OPTIMIZATION – JOEL SNYDER
We invited every major network optimization vendor, and ended up with seven contenders: Blue Coat, Cisco, Citrix, Exinda, Ipanema, Riverbed and Silver Peak.

Our Clear Choice Test winner is Riverbed, which excels at the core WAN optimization functions of compression and de-duplication. If you’re looking for innovation, you’ll be as impressed as we were with Ipanema Technologies' ip|engines and Exinda Networks' x800 series.

For great performance, we were again impressed with Silver Peak. And if you’re running all Cisco at the network edge, Cisco’s WAAS is a no-brainer with big benefits at moderate cost.

2. MOBILE DEVICE MANAGEMENT – DAVID STROM
We looked at six products: AirWatch, Apperian EASE, BlackBerry Enterprise Server 10 (BES10), Divide, Fixmo, and Good Technology's Good for Enterprise. Each has a somewhat different perspective and different strengths in terms of what it can control best.

AirWatch had the widest phone/tablet/desktop support. But it also requires a collection of different downloaded apps that could be confusing to use. If you’re going the secure container route, Fixmo is a strong contender.

BlackBerry should be on your short list if your primary goal is protecting your messaging infrastructure. Good Technology is a mature product that features solid email security, fast device enrollment, extensive security policies and wide device support.

Divide had the most appealing management console and overall simplest setup routine. It features the best overall approach to MDM and is the easiest to operate, but has the most limited device OS version support. Apperian does a great job with setting up a protected app portal, but falls down on some basic MDM issues.

3. MIDRANGE MANAGEMENT TOOLS – BARRY NANCE
If your network has between 1,000 and 10,000 devices and computers, you have a midsized network. Your servers, connections and other resources suffer the same problems as larger networks, but your budget for keeping the network healthy is less than what large enterprises enjoy.

We tested six products that provide a management suite for mid-range networks: Paessler PRTG v12.4, Heroix Longitude v8.1, HP Intelligent Management Center (IMC) Standard and Enterprise v5.2, Ipswitch WhatsUp Gold (WUG) v16, SolarWinds Orion Network Performance Monitor (NPM) v10.4 and Server & Application Monitor (SAM) v5.2 and Argent Software Advanced Technology (AT) v3.1, including Argent Commander 2.0 and Argent Reports 2.0.

Argent Advanced Technology earns itself the Network World Clear Choice award, edging Heroix Longitude, which came in second. Advanced Technology gave us sophisticated thresholds, a responsive user interface, accurate device discovery, time-saving root cause analysis, helpful corrective actions and meaningful reports.

4. HOSTED VDI – TOM HENDERSON
We compared hosted virtual desktop infrastructure (VDI) products from Microsoft, Citrix, and VMware and came to many conclusions, but the most important one is this: Setting up hosted desktop sessions in a BYOD world is a complex undertaking.

Our Clear Choice Test winner is Citrix's VDI-in-a-Box for its ease of integration, flexibility of both hosted operating systems and variety of clients, and its end-user experience.

VMware's Horizon View 5.2 is also very capable and can scale dramatically, but it’s more limited in both hosts (Windows) and clients served. Windows Server 2012 is good, yet requires a buy-in to Microsoft's System Center Configuration Manager, and has less client flexibility.

5. PERSONAL CLOUDS – WAYNE RASH

A personal cloud service lets you share photos, music and documents among all your devices easily and quickly. The good news is that these cloud services are normally free for a limited amount of data. Most vendors also offer premium or enterprise versions, which allow you to store more data and to share it, which is useful in a workgroup scenario, for example.

We looked at nine personal cloud services: Apple’s iCloud, Bitcasa, Box, Dropbox, Google Drive, Microsoft SkyDrive, MediaFire, SpiderOak and Ubuntu One. While iCloud, SkyDrive and Google Drive are optimized for their respective platforms, all of the cloud services work across multiple operating systems and different browser types.

There was no single cloud service that we considered the winner. All worked as advertised, and all had their strengths as well as their peculiarities and annoyances.

6. LINUX-BASED SERVER OPERATING SYSTEMS – SUSAN PERSCHKE
The five products we tested -- SUSE Enterprise Server 11 Service Pack 2, Mandriva Business Server 1.0, ClearOS 6 Professional, Red Hat Enterprise Linux 6.4 and Ubuntu 12.04 LTS -- are all enterprise server versions offering commercial support options, either at the OS level or in the form of commercial management tools and support plans.

Our Clear Choice Test winner is Ubuntu, which delivered intuitive, uncluttered management tools, excellent hypervisor support, and transparency (commercial and open source versions are one and the same).

The remaining four contenders fell into two categories with Red Hat and SUSE representing enterprise-level offerings and Mandriva and ClearOS geared more towards small and midsize businesses. In the SMB segment ClearOS edged out Mandriva.

7. TWO-FACTOR AUTHENTICATION – DAVID STROM
Relying on a simple user ID and password combination is fraught with peril. One alternative is to use one of the single sign-on solutions we reviewed last year, but there are less expensive options that could also be easier to install.

That’s where two-factor authentication services come into play. Years ago, vendors came out with hardware-based two-factor authentication: combining a password with a token that generates a one-time code. But toting around tokens means they can be lost or stolen, and in a large enterprise, hard tokens are a pain to manage, provision and track.

Enter the soft token, which could mean using a smartphone app, SMS text message, or telephony to provide the extra authentication step. We reviewed eight services that support up to five kinds of soft tokens: Celestix's HOTPin, Microsoft's PhoneFactor, RSA's Authentication Manager, SafeNet Authentication Service, SecureAuth's IdP, Symantec Validation and ID Protection Service (VIP), TextPower's TextKey, and Vasco's Identikey Authentication Server.
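The one-time codes these tokens produce are typically HOTP or TOTP (RFC 4226 and RFC 6238). A minimal sketch of the core algorithm, not any vendor's implementation:

```python
import hashlib
import hmac
import struct
import time

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226 HMAC-based one-time password."""
    digest = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                      # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10**digits).zfill(digits)

def totp(secret: bytes, period: int = 30, digits: int = 6) -> str:
    """RFC 6238 time-based variant: the counter is the current 30-second window."""
    return hotp(secret, int(time.time()) // period, digits)

# RFC 4226 test vector: counter 0 with this secret yields "755224"
print(hotp(b"12345678901234567890", 0))
```

A smartphone app and the server share the secret and each compute the same code independently, which is why a short code delivered out of band can serve as the second factor.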

8. ULTRABOOKS – WAYNE RASH
We tested eight ultrabooks, all with touchscreens and all running Windows 8 Professional. They are: the astonishingly thin Acer Aspire S7 and Asus Zenbook UX31A, the flip-screen Dell XPS 12, HP’s Envy 400t-12, Lenovo’s business-oriented ThinkPad Carbon X1 and the flexible Yoga 13, the Samsung ATIV Tab 7 that transforms into a tablet, and the Sony Vaio T-15.

Our favorite, because it was the easiest to type on and the easiest to use overall, was the Lenovo ThinkPad Carbon X1. This ultrabook has three ways to control the pointer and had the best keyboard by far, yet it was still thin and light.

If you need your ultrabook to convert to a tablet, then you might like the Samsung ATIV Tab 7, or the Yoga or Dell, which fold or flip to become tablets. Acer and Asus win points for being sexy, thin and stylish, so if you want to impress in the conference room, these might be for you.

9. SOFTWARE-BASED NAS – ERIC GEIER
Earlier this year we tested Network Attached Storage (NAS) appliances. Now we're reviewing software-based NAS that you can load onto your own equipment — whether it's a PC, server, virtual machine, or in the cloud. We looked at FreeNAS, Openfiler, Open-E DSS, NexentaStor, and SoftNAS. All offer some sort of free solution or service, with some being fully open source.

Going with a software solution enables you to select and customize the hardware it runs on to best fit your particular application and environment. For a small and simple network you could load the software on a spare consumer-level PC, or for bigger networks purchase a server or run on a virtual machine.

On the other hand, going with an appliance may be better if you aren't comfortable selecting the hardware, installing the software, and then maintaining both. Appliances are generally more plug-and-play, whereas with software solutions you have to spend some time building your own appliance.

10. OPEN SOURCE MANAGEMENT TOOLS – SUSAN PERSCHKE
We reviewed four popular open source products - Nagios Core 3.5, NetXMS 1.2.7, OpenNMS 1.10.9 and Zenoss Core 4.2. All four products are mature, have extensive monitoring capabilities similar to their enterprise-grade counterparts, and are actively updated with good community support.

Zenoss is our top pick, due primarily to its intuitive and professional-grade admin interface. We were also able to configure our environment and run reports easily, and when help was needed, we found the user guide to be an excellent resource, a rare find in the open source world.

Nagios is a good choice if a smaller footprint is desired and the infrastructure is limited in number of devices. Although NetXMS has a somewhat cluttered user interface, it boasts a rich toolset that provides a lot of granularity for infrastructure management and gets a plus for attention to mobile. OpenNMS is another powerful net management tool capable of running on most platforms and with the ability to manage a lot of data.



Tuesday, 3 December 2013

What Contract IT Workers Miss About Being Full-Time

A survey of contract and freelance IT professionals shows healthcare and 401k plans are the most-missed perks of traditional employment, while there are many things they don't miss at all.

What would you miss most about "traditional," 9-to-5, office employment? According to a survey released today, the top three most-missed benefits and perks among independent IT pros are health insurance, brainstorming with colleagues, and 401K matching.

Surprisingly (or perhaps not), few people missed the company holiday party or formal performance reviews. In the survey, conducted by IT staffing and Workforce-as-a-Service firm OnForce, 19 percent ranked formal performance reviews and 17 percent ranked the holiday party as their second- and third-least-missed aspects of traditional employment, respectively.

The Changing Face of IT

OnForce surveyed 1,337 anonymous, independent IT service professionals throughout the United States ranging in age from 21 to over 60, and found that 37 percent of respondents missed employee-provided health insurance, 30 percent listed brainstorming with colleagues as a benefit they missed, and 20 percent answered that they missed their former employers' 401k matching plans.

OnForce's CEO, Peter Cannone, says the results are in line with what he sees as the changing face of the current IT workforce, and that the results of the survey have broad implications outside the IT industry and beyond the pool of independent IT professionals.

"When you consider that 50 percent of today's workforce will join the ranks of the self-employed by the year 2020, and that 60 percent of IT service professionals willingly left full-time jobs to be their own boss, it's important for companies to understand the current needs of full-time and contract employees if they want to remain competitive," Cannone says, citing research obtained by OnForce.

The survey also revealed that different age groups are concerned with different benefits and perks. Of the 37 percent of respondents who ranked health insurance as the most-missed benefit, most fell into the 40-49 age bracket; 10 percent were aged 50-59, 9 percent 30-39, and 3 percent 21-29 or over 60.

While health insurance and 401K plans are still a top priority among the workforce, the survey also showed that social and professional development in the workplace are extremely important, especially to those who fall between ages 40 and 49. Yet, according to the survey, that same demographic rated the company holiday party and formal performance reviews among the least-missed aspects of 'traditional' work life.

But one aspect of traditional, full-time employment remains almost universally reviled, regardless of demographic: commuting. Thirty-eight percent of all respondents, across age groups, cited it as their least-missed aspect of 'traditional' employment.



Monday, 25 November 2013

'Team Moscow' wins $100K in PayPal's Battle Hack 2013

PayPal's contest challenged teams from around the world to create best app to help community

A group of Russian software developers dubbed "Team Moscow" has won PayPal’s Battle Hack 2013 and its $100,000 prize, awarded for the best socially worthy use of PayPal's API. A team from Israel finished second, and one from Miami finished third.

Team Moscow produced a “Donate Now” application that leverages Bluetooth Low Energy technology to allow anyone to instantly donate to a cause right from their mobile device without filling in lengthy forms. Team Tel Aviv had an app that connects runners to encourage running, and Team Miami had LoanPal, a peer-to-peer lending service for “underbanked individuals.”

Background: PayPal’s Battle Hack competition wants cool social apps

Team Moscow members include Sergey Pronin, Alexander Balabna, Bayram Annakov, and Oksana Tretiakova, who are sharing the $100,000 prize. Pronin responded to questions via e-mail:

What kind of background do you have in application development?
I have a bachelor’s degree in Software Engineering from the National Research University Higher School of Economics (HSE). My teammates and I work for a Russian software company called Empatika, where I’m a senior developer working primarily on an app called App in the Air. I love programming and developing applications, and all of us enjoy participating in the internal hackathons our company hosts. For example, at the company’s last hackathon a few months ago we worked with an Arduino for the first time.

I use Objective-C and Python on a daily basis, and Java and web technologies from time to time. I'm also currently in the second year of a Master's degree program in software engineering.

What does your winning PayPal application do?
The project consists of two main parts: the beacon, an Arduino-based BLE beacon with a Rainbowduino screen, and the client app. The idea is to help people donate by simplifying the actual donation process and making donations more contextual. For our presentation we used "Food for Homeless" as an example, a bus that serves food to homeless people. Typically, the only way to donate is to fill in 23 fields in a form on their website, a problem that is even worse on a smartphone. If you are on foot and see the bus, you don't have much time to fill in all the forms; our app addresses this problem.

Any other comments about your experience in the PayPal contest are welcome.

It is not our first hackathon, but I can surely say that it was the best experience. PayPal has done a really great job — the environment and facilities can't be beat.




Wednesday, 20 November 2013

10 mistakes companies make after a data breach

Michael Bruemmer, vice president of Experian Data Breach Resolution, outlined some of the common mistakes his firm has seen as organizations deal with the aftermath of a breach, during a presentation for the International Association of Privacy Professionals (IAPP) Privacy Academy.

How to weather the storm
The aftermath of a data breach, such as the one experienced last month by Adobe, can be chaotic if not dealt with properly. The result of such poor handling could see organizations facing a hit to reputation, or worse, financial and legal problems.

No external agencies secured
Sometimes a breach is too big to deal with in-house, and the type of breach may make that option an unwise one. So it's best to have external help available if needed. Incident Response teams, such as those offered by Verizon Business, Experian, Trustwave, or IBM (just to name a few), should at least be evaluated and considered when forming a business continuity / incident response plan.

"The process of selecting the right partner can take time, as there are different levels of service and various solutions to consider," Bruemmer said. Not having a forensic expert or resolution agency identified in advance can cost an organization valuable time once a breach hits.

No engagement with outside counsel
"Enlisting an outside attorney is highly recommended," Bruemmer said.

"No single federal law or regulation governs the security of all types of sensitive personal information. As a result, determining which federal law, regulation or guidance is applicable depends, in part, on the entity or sector that collected the information and the type of information collected and regulated."

So unless internal resources are well versed in all current laws and regulations, external legal counsel with expertise in data breaches is a wise investment.

No single decision maker
"While there are several parties within an organization that should be on a data breach response team, every team needs a leader," Bruemmer said.

There needs to be one person who will drive the response plan and act as the single point of contact for all external parties. That person will also control the internal reporting structure, ensuring that everyone from executives to individual response team members is kept updated.

Lack of clear communication
Related to the lack of a single decision maker, a lack of clear communication is also a problem. Miscommunication can be a key driver in mishandling a data breach, Bruemmer said, as it delays the process and adds confusion.

"Once the incident response team is identified, identify clear delegation of authority, and then provide attorneys and [external parties] with one main contact."

No communications plan
Sticking to the communications theme, another issue organizations face is the lack of planning as it relates to the public, especially the media.

"Companies should have a well-documented and tested communications plan in the event of a breach, which includes draft statements and other materials to activate quickly. Failure to integrate communications into overall planning typically means delayed responses to media and likely more critical coverage," Bruemmer explained.

Waiting for perfect information before acting
Dealing with the aftermath of a data breach often requires operating with incomplete or rapidly changing information, as internal or external security forensics teams learn more about the incident.

"Companies need to begin the process of managing a breach once an intrusion is confirmed and start the process of managing the incident early. Waiting for perfect information could ultimately lead to condensed timeframes that make it difficult to meet all of the many notification and other requirements," Bruemmer said.

Micromanaging the Breach
"Breach resolution requires team support, and often companies fail when micromanaging occurs. Trust your outside counsel and breach resolution vendors, and hold them accountable to execute the incident response plan," Bruemmer said.

No remediation plans post incident
There should be plans in place that address how to engage with customers and other audiences once the breach is resolved, as well as the establishment of additional measures to prevent future incidents.

"If an organization makes additional investments in processes, people and technology to more effectively secure the data, finding ways to share those efforts with stakeholders can help rebuild reputation and trust. Yet, many fail to take advantage of this longer-term need once the initial shock of the incident is over," Bruemmer said.

Not providing a remedy to consumers
Customers should be put at the center of decision making following a breach. This focus means providing some sort of remedy, including call centers where consumers can voice their concerns and credit monitoring if financial, health or other highly sensitive information is lost.

"Even in incidents that involve less sensitive information, companies should consider other actions or guidance that can be provided to consumers to protect themselves," Bruemmer said.

Failing to practice
"Above all, a plan needs to be practiced with the full team. An incident response plan is a living, breathing document that needs to be continually updated and revised. By conducting a tabletop exercise on a regular basis, teams can work out any hiccups before it's too late," Bruemmer said.


12 hot security start-ups to watch

These start-ups are focusing on security in cloud services and mobile devices

Going into 2014, a whirlwind of security start-ups are looking to have an impact on the enterprise world. Most of these new ventures are focused on securing data in the cloud and on mobile devices. Santa Clara, Calif.-based Illumio, for example, founded earlier this year, is only hinting at what it will do in cloud security. But already it's the darling of Silicon Valley investors, pulling in over $42 million from backers Andreessen Horowitz, General Catalyst, Formation 8 and others.

The cloud’s lure is easy to see. More businesses continue to adopt a wide range of cloud services -- whether software-as-a-service, infrastructure-as-a-service or platform-as-a-service. That means the enterprise IT department needs more visibility, monitoring and security controls over what employees are doing, plus evidence that their data is safe. In addition, employees today increasingly use smartphones and tablets they personally own for work in “Bring Your Own Device” mode, raising further management and security questions. Where there are perceived security “gaps,” start-ups see opportunities, as the 12 firms we identify here do.

Security is increasingly delivered not as on-premises software or hardware but at least partly, if not wholly, as a cloud-based service. Gartner predicts security-as-a-service will grow from about $2.13 billion now to $3.17 billion in 2015.

Gartner: Cloud-based security as a service set to take off

With all of that in mind, here’s our slate of security start-ups worth watching in the near future:

Adallom is based in Menlo Park, Calif., but has its research and development roots in Israel, where its three co-founders, CEO Assaf Rappaport, vice president of R&D Roy Reznik and CTO Ami Luttwak, have backgrounds in the Israeli cyber-defense forces. Adallom — a word which means “last line of defense” in Hebrew — is taking on the problem of monitoring user actions related to software-as-a-service (SaaS) usage. The firm’s proxy-based technology, announced this month, is offered to the enterprise either as a security service in the cloud or as server-based software run on premises.

The goal is to provide real-time analysis, a clear audit trail, and reporting on SaaS application usage across the enterprise. The monitoring allows for automatically or manually terminating sessions or blocking content downloads. Though not wholly similar, its closest competitors are two other start-ups, SkyHigh Networks and Netskope. The venture has received $4.5 million in funding from Sequoia Capital.

AlephCloud hasn’t yet made its software and service, called AlephCloud Content Canopy, generally available, but its purpose is to provide controlled encryption and decryption of documents transmitted business-to-business via cloud-based file synchronization and sharing services such as Dropbox, SkyDrive and Amazon S3. The company was founded in 2011 by CEO Jieming Zhu and CTO Roy D’Souza. Zhu says Content Canopy works by means of the “federated key management” process AlephCloud developed, which can use the existing enterprise public-key infrastructures employed in identity management. For the end user who is permitted to retrieve and decrypt the encrypted document via Dropbox or SkyDrive, it’s all transparent. AlephCloud says its “zero-knowledge” encryption process means the company never holds the private encryption key. AlephCloud will first support PCs, Macs and Apple iOS devices, along with specific file-sharing services, with Android support coming next year. Zhu says the underlying technology can be expanded to other applications as well. AlephCloud has received $9.5 million in venture-capital funding, including $7.5 million from Handbag LLC and the remainder from angel investors.
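AlephCloud hasn’t published implementation details, but client-side “envelope” encryption of this general shape is the standard way to keep both the storage provider and the key intermediary out of the plaintext. The sketch below uses the third-party Python `cryptography` package; the function names are invented for illustration and none of this is AlephCloud’s actual API.

```python
# Sketch of client-side "envelope" encryption: the sharing service stores only
# ciphertext, and the intermediary never holds the recipient's private key.
# Names are hypothetical; requires the third-party `cryptography` package.
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

OAEP = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# The recipient generates a keypair; only the public half is ever shared.
recipient_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

def encrypt_for_recipient(document: bytes, public_key):
    """Encrypt with a fresh symmetric key, then wrap that key for the recipient."""
    content_key = Fernet.generate_key()
    ciphertext = Fernet(content_key).encrypt(document)
    wrapped_key = public_key.encrypt(content_key, OAEP)
    return ciphertext, wrapped_key   # both safe to hand to Dropbox or SkyDrive

def decrypt_as_recipient(ciphertext: bytes, wrapped_key: bytes, private_key):
    content_key = private_key.decrypt(wrapped_key, OAEP)
    return Fernet(content_key).decrypt(ciphertext)

ct, wk = encrypt_for_recipient(b"quarterly numbers", recipient_key.public_key())
assert decrypt_as_recipient(ct, wk, recipient_key) == b"quarterly numbers"
```

The point of the pattern is that the file-sharing service sees only `ciphertext` and `wrapped_key`, while the key intermediary can broker access without ever holding `recipient_key`.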

BitSight Technologies has a simple proposition. It’s not uncommon for companies to want to evaluate the IT security of another business before entering into an e-commerce arrangement where networks may be interconnected in some way. BitSight, co-founded in 2011 by CTO Stephen Boyer and COO Nagarjuna Venna, has a security “rating” service to do this, though there are limits on how far it can go at this point. The BitSight approach, says vice president of marketing Sonali Shah, relies on an analysis of Internet traffic by BitSight sensors to detect whether a company’s IT assets, such as computers, servers or networks, have been commandeered by threats such as botnets or denial-of-service attacks. But she acknowledges there’s not yet a way for BitSight to determine what security issues might arise in a company’s use of cloud services. Cambridge, Mass.-based BitSight has received $24 million in venture-capital funding from investors that include Menlo Ventures, Globespan Capital Partners, Commonwealth Capital and Flybridge Capital Partners.

Defense.net is focusing on stopping denial-of-service attacks aimed at both enterprises and cloud service providers. Founded by its CTO Barrett Lyon, who started another anti-DDoS firm, Prolexic, in 2003, Defense.net relies on a cloud service, with no appliance needed, to mitigate large-scale DDoS assaults. Many in the industry say DDoS attacks are growing worse in scale and number. For his part, Lyon thinks the average DDoS attack is probably 16 times larger and “significantly more sophisticated than it was a year earlier.” Defense.net has received $9.5 million in funding from Bessemer Venture Partners.

Illumio, founded by its CEO Andrew Rubin earlier this year, is still in stealth mode, maintaining a discreet silence about its intentions. But the hints sprinkled across its website indicate the Santa Clara, Calif.-based company’s focus is likely to be cloud-based security with an emphasis on virtualization. Illumio has brought in former VMware techies and execs. As for Rubin himself, he was formerly CEO at Cymtec Systems, a security firm providing enterprises with visibility, protection and control over Web content and mobile devices, plus a means for intrusion-detection analysis. Illumio has received more than $42 million in funding from Andreessen Horowitz, General Catalyst, Formation 8 and others.

Lacoon Mobile Security has come up with a sandboxing approach to detect zero-day malware targeting Android and Apple iOS devices by means of a small lightweight agent that examines mobile applications through behavior analysis and a process tied to the Lacoon cloud gateway. The start-up was founded by CEO Michael Shaulov, vice president of research and development Ohad Bobrov, and Emanuel Avner, the CFO. The company has its R&D arm in Israel and its headquarters in San Francisco. It’s backed by $8 million in venture-capital funding led by Index Ventures, plus $2.7 million in angel investing, including from Shlomo Kramer, CEO at Imperva.

Malcovery Security, based in Pittsburgh, was spun out in 2012 from research on phishing done at the University of Alabama at Birmingham, according to its CTO Greg Coticchia. Targeted phishing attacks can have disastrous outcomes when they are used to infiltrate organizations and steal data. Coticchia says the Malcovery technologies offered to businesses include ways to identify phishing websites and a service that can detect phishing e-mail. The company’s founders include Gary Warner, director of research in cyber forensics at the University of Alabama, and the start-up has received about $3 million in funding from the university.

Netskope wants to help businesses monitor how their employees are using cloud-based applications and apply security controls, such as giving IT managers the ability to block data transfers or receive alerts. The Netskope service can apply security controls to about 3,000 different cloud-based applications, whether SaaS, PaaS or IaaS. The service is meant to let IT departments get a grip on cloud usage and avoid the “shadow IT” problem of business people initiating cloud services without informing IT at all. The Los Altos, Calif.-based start-up was founded in 2012 by CEO Sanjay Beri along with chief architect Ravi Ithal, chief scientist Krishna Narayanaswami, and Lebin Chang, head of application engineering teams, all of whom bring tech industry experience ranging from Juniper to Palo Alto Networks to VMware. Netskope has amassed $21 million in venture funding from Social+Capital Partnership and Lightspeed Venture Partners.
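The kind of per-app control described above can be pictured as a policy table consulted for every observed user action. This is a toy sketch; the rules, verdicts and event fields are invented for the example and are not Netskope’s actual API.

```python
# Toy illustration of per-app policy enforcement at a cloud-usage gateway.
# Apps, actions, and verdicts below are hypothetical examples.
POLICY = {
    ("dropbox", "upload"): "block",    # stop data transfers to unsanctioned apps
    ("salesforce", "export"): "alert", # allow, but notify IT managers
}

def enforce(app: str, action: str) -> str:
    """Return the verdict for one observed user action; default is allow."""
    return POLICY.get((app, action), "allow")

assert enforce("dropbox", "upload") == "block"
assert enforce("salesforce", "export") == "alert"
assert enforce("office365", "view") == "allow"
```

A real gateway sits inline (as a proxy) so a "block" verdict can actually terminate the transfer rather than just record it.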

PrivateCore is a crypto-based security play focused on using the central processing unit (CPU) as the trusted component for encrypting data in use. Its vCage software relies on Intel Sandy Bridge-based Xeon servers for secure processing in cloud environments, starting with IaaS. The challenge in processing encrypted data is “the problem with having to decrypt to do processing,” says Oded Horovitz, CEO of the Palo Alto, Calif.-based start-up he co-founded with CTO Steve Weis and Carl Waldspurger as adviser. The vCage approach makes use of Intel Trusted Execution Technology and the Advanced Encryption Standard algorithm to keep data encrypted in RAM; with about 20MB of cache now available on Sandy Bridge parts, Horovitz points out, there is enough room to get the job done. The data in question is decrypted only inside the CPU. The approach is being tested now by IaaS providers and some enterprises, and PrivateCore expects to have its first product in general release early next year. The start-up has received $2.4 million in venture capital from Foundation Capital.

Skycure is all about mobile-device security, with its initial focus on Apple iOS iPhones and iPads. It recently introduced what’s described as an intrusion-detection and prevention package for mobile devices, which Skycure’s co-founder and CTO Yair Amit says relies on the Skycure cloud service for security purposes. He says the goal is to prevent and mitigate any impact from attackers exploiting configuration profiles on mobile devices. Skycure, based in Tel Aviv, Israel, was co-founded by CEO Adi Sharabani and the company has received about $3 million in venture-capital funding from Pitango Venture Capital and angel investors.

Synack was founded by two former National Security Agency (NSA) computer network operations analysts, CEO Jay Kaplan and CTO Mark Kuhr. According to them, the Menlo Park, Calif.-based start-up is bringing together security experts with expertise in finding zero-day bugs in software, particularly in websites and applications of Synack customers. “We pay researchers for vulnerabilities found,” explained Kaplan last August as Synack officially debuted. He says bug bounty rates typically run a minimum of $500 to several thousand for serious vulnerabilities in databases, for example. Synack says it has cultivated relationships with several bug hunters around the world, including at the NSA, who would be available to take on specific assignments. Synack has received $1.5 million in venture-capital funding from a combination of investors that include Kleiner Perkins Caufield & Byers, Greylock Partners, Wing Venture Partners, Allegis Capital and Derek Smith, CEO of start-up Shape Security.

Threat Stack, founded by CEO Dustin Webber with Jennifer Andre, wants to give enterprises a way to know if hackers are breaking into the Linux-based servers they use in their cloud services. To monitor for hacker activity, the start-up’s Cloud Sight agent software for Linux needs to be installed on the Linux server under administrative control in the cloud environment, says Webber. “We look for the behavior of the hacker,” he points out, noting the enterprise will get an alert if a hacker break-in is underway, and a measure of forensics about incidents can be obtained if needed. Cloud Sight could also be used by cloud service providers, but the initial focus is on monitoring for the enterprise, he says. Threat Stack, founded in Cambridge, Mass., in 2012, has obtained $1.2 million in funding from Atlas Venture and .406 Ventures. The start-up is yet another example of the new energy directed toward providing visibility, monitoring and security for businesses adopting cloud services.
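Behavior-based alerting of the sort Webber describes boils down to watching event streams for suspicious patterns. Here is a minimal, hypothetical example that flags a burst of failed logins followed by a success from the same address; the log format is a simplified stand-in, not Cloud Sight's.

```python
# Minimal sketch of host-level behavioral alerting: flag an address whose
# repeated failed logins are followed by a success (a likely brute force).
# The two-field log lines are simplified stand-ins for real auth-log entries.
from collections import defaultdict

def suspicious_sources(log_lines, threshold=3):
    failures = defaultdict(int)
    alerts = []
    for line in log_lines:
        status, ip = line.split()          # e.g. "FAILED 203.0.113.9"
        if status == "FAILED":
            failures[ip] += 1
        elif status == "ACCEPTED" and failures[ip] >= threshold:
            alerts.append(ip)              # brute force likely succeeded
    return alerts

log = ["FAILED 203.0.113.9"] * 4 + ["ACCEPTED 203.0.113.9",
                                    "ACCEPTED 198.51.100.2"]
assert suspicious_sources(log) == ["203.0.113.9"]
```

A real agent watches many such signals (process launches, file changes, network connections), but the detect-and-alert loop has this same shape.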




Wednesday, 13 November 2013

Microsoft: Fuel-cell powered data centers cost less, improve reliability

Microsoft researchers say fuel cell-based data centers could go where no data centers have gone before

Data centers powered by fuel cells, not the public power grid, could cut both capital and operational costs, improve reliability, pollute less and take up less space, according to Microsoft researchers.

This technology could make data center expansion possible in regions where utility-supplied power is tapped out but natural gas is abundant, according to a paper posted by Microsoft Research. Also, since the reliability of gas supply is better than that of electrical power, these data centers would suffer less downtime.

The researchers say there are many variables that need to be taken into account in engineering these facilities, but overall they hold potential for greener data centers.

The researchers looked at distributing relatively small fuel cells, similar to those used on propane-powered buses, around data centers to power a rack or two of servers each, and found several potential benefits. It eliminates the need for the wired electrical distribution system in a traditional data center. If a fuel cell were to fail, it would affect only a limited number of servers, which data center management software could handle. And since the power is DC, the AC-to-DC converters in the servers could be eliminated.

In that configuration the power supply would be near each rack, so there would be no need for a data center-wide electricity distribution system with its attendant transformers, high-voltage switching gear and distribution cabling. The pipes and leak sensors needed to distribute natural gas cost less than that electrical gear. The tradeoff in the amount of space the equipment occupies means a 30% reduction in the required square footage for the data center as a whole, the researchers say.

The fuel cells emit 49% less carbon dioxide, 68% less carbon monoxide and 91% less nitrogen oxide than traditional power sources, the researchers say.

Fuel cells do require specialized equipment not needed in traditional data centers, such as reformers that pull hydrogen from methane, batteries, startup systems and auxiliary circuits.

Design of a fuel cell-powered data center would have to take into account the spikes in server usage that require instantaneous increases in power supply. Fuel cells, which perform best under constant load, can lag seconds behind changes in demand. “Some of the spikes can be absorbed by the server power supply with its internal capacitors. But large changes like flash crowd and hardware failures must be handled by an external energy storage (batteries or super-caps) or load banks,” the paper says.

The researchers figured rack-level fuel cells at $3 to $5 per watt and planned a five-year replacement cycle for them; the entire system life was set at 10 years. They eliminated the cost of diesel generators and uninterruptible power supplies because the natural gas supply is so reliable.

The lag issue could be addressed by installing server-sized batteries that could jump in with extra power when server hardware is starting up or shutting down, the times of greatest change in power draw. Fuel cells also give off heat, so these data centers would need greater fan capacity to cool them.
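The interplay the researchers describe, a slow-ramping fuel cell with a battery absorbing spikes, can be sketched in a toy model. The wattages and ramp rate below are invented for illustration and are not figures from the paper.

```python
# Toy model of why a rack-level fuel cell needs a battery buffer: the cell
# ramps toward demand by a limited amount per interval, and the battery
# covers the shortfall during a sudden spike. All numbers are illustrative.
def simulate(demand, ramp_per_step=50):
    """Return battery draw (watts) per interval for a slow-ramping fuel cell."""
    cell_output = demand[0]              # assume steady state at the start
    battery_draw = []
    for watts in demand:
        if cell_output < watts:          # cell ramps up, limited per interval
            cell_output = min(watts, cell_output + ramp_per_step)
        else:                            # cell ramps back down to lower demand
            cell_output = max(watts, cell_output - ramp_per_step)
        battery_draw.append(max(0, watts - cell_output))
    return battery_draw

# Steady 400 W load, then a flash-crowd spike to 600 W: the battery supplies
# a shrinking share of the load while the fuel cell catches up.
assert simulate([400] * 3 + [600] * 4) == [0, 0, 0, 150, 100, 50, 0]
```

The battery's required capacity is set by the worst-case gap between the demand curve and what the cell can ramp to, which is exactly the sizing question the paper raises.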

The capital cost of a traditional data center is $313.43 per rack per month; a rack-level fuel cell data center costs between $50.72 and $63.36 less than that, the researchers say. Operating expenses per rack per month for a traditional data center are $223.51, vs. $214.06 for one powered by polymer electrolyte membrane fuel cells. The savings would be greater with a different technology called a solid oxide fuel cell, and would fluctuate depending on the price of electricity where the data center is located.
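As a quick check, the per-rack figures quoted above combine as follows. This is a restatement of the article's numbers, not new data.

```python
# Combining the article's per-rack monthly figures into a total saving range.
traditional_opex = 223.51                 # traditional design, per rack/month
fuel_cell_opex = 214.06                   # PEM fuel cell design, per rack/month
capex_saving_low, capex_saving_high = 50.72, 63.36   # quoted capex range

opex_saving = traditional_opex - fuel_cell_opex       # about $9.45/rack/month
total_low = capex_saving_low + opex_saving
total_high = capex_saving_high + opex_saving

assert round(opex_saving, 2) == 9.45
assert round(total_low, 2) == 60.17
assert round(total_high, 2) == 72.81
```

So the combined capital-plus-operating saving works out to roughly $60 to $73 per rack per month under the paper's assumptions.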

Reliability of natural gas distribution systems is better than that of the electrical grid, and that would on average cut annual downtime from 8 hours, 45 minutes to 2 hours, 6 minutes, the researchers say.
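Those downtime figures translate into availability percentages with simple arithmetic on the numbers quoted above.

```python
# Converting the quoted annual downtime figures to availability percentages.
minutes_per_year = 365 * 24 * 60          # 525,600

grid_downtime = 8 * 60 + 45               # 8 h 45 min = 525 minutes/year
gas_downtime = 2 * 60 + 6                 # 2 h 6 min  = 126 minutes/year

grid_availability = 1 - grid_downtime / minutes_per_year
gas_availability = 1 - gas_downtime / minutes_per_year

assert round(grid_availability * 100, 3) == 99.9     # about "three nines"
assert round(gas_availability * 100, 3) == 99.976
```

In other words, the researchers' downtime estimate moves the facility from roughly three-nines availability to a bit under four nines.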

The researchers considered using large fuel cells to plug into a traditional data center design as a direct replacement for a utility-provided electric service, but they decided that the larger the fuel cell the greater the chance of failure. Plus the cost was high.

They also considered tiny fuel cells to power individual servers. A failure would affect just one server, and because the cell is integrated there is no DC transmission loss. However, lots of tiny cells may add up to a less efficient and less cost-effective use of energy than the slightly larger ones needed for racks.





Friday, 8 November 2013

What Google killing IE9 support means for software development

Google's announcement that it won't support Internet Explorer 9 is a sign of a broader move toward rapid iteration in software development.

Let's start with this: I am completely OK with this. Sure, it may be a bit of a drag for people running Windows Vista (newer versions of Internet Explorer, 10 and 11, require Windows 7 or 8) but, let's be honest, nobody expects any company to spend the money and man-hours supporting every web browser for all eternity. And Vista users still have the option of installing another web browser, such as Firefox or Chrome.

So, if this isn't all that big of a deal, why am I bringing it up?

Web browsers are, in essence, platforms for running software.
Internet Explorer 9 was released in 2011. It’s only two years old.

That means that we have reached the point where complete application platforms are being deprecated, and left unsupported, after having existed for only two years. And, while that does bode well for the rapid improvement of platforms, it comes with a pretty steep price.

The most obvious is that end users are put in the position of needing to upgrade their systems far more often. This costs a not-insignificant amount of time (especially in larger organizations) and money. It is, to put it simply, inconvenient.

This rapid iteration of new versions of these systems also takes a heavy toll on software development. More versions of more platforms means more complexity in development and testing. This leads to longer, more costly development cycles (and significantly higher support costs). The result? The software that runs on these systems improves more slowly than it otherwise could, and in all likelihood it will be of lower quality.

These are some pretty major drawbacks to the current “Operating Systems and Web Browsers are updated every time the wind changes direction” situation. But is it really all that bad? The alternative, for Windows users, isn't terribly attractive. Nobody wanted to be stuck with IE 6 for a second longer than was absolutely necessary.

I don't have a solution to any of this, mind you. Not a good one, at any rate – maybe we should make a gentleman's agreement to not release new Operating Systems or Browsers more often than every three years. (See? Not a good solution.)

I'm just not a big fan of how it's currently working.



Wednesday, 23 October 2013

Microsoft's Graveyard: 16 products that Microsoft has killed

Microsoft's Graveyard: 16 products that Microsoft has killed
Some were killed off, others folded into new products. Either way, no product lives forever.

Every product has its end. It is either replaced, upgraded or merged in with something else. Even Microsoft, a company that is notoriously generous and patient with letting a product gain momentum, is willing to pull the plug when necessary.

Here are some of the most notable Microsoft products that have met their demise.

TechNet
This was probably the biggest product to go to the graveyard in 2013. Microsoft announced the end for TechNet due to rampant abuse and piracy. The company started TechNet in 1998 to sell IT professionals perpetual licenses to Windows client and server operating systems. People abused the system for years before Microsoft had enough. Users are now being migrated to the MSDN network.

Live Products
Microsoft did a lot of consolidation this year, and its Live products got folded into a lot of other programs. Live Mail and Hotmail were folded into Outlook.com, Live Mesh was sunset in favor of SkyDrive, and Live Messenger was axed at the beginning of the year with existing accounts being transferred to Skype.

Surface Pro
It came out in February and was gone by October. But with good reason. The Surface Pro 2 tablet is a huge improvement over the original Surface Pro, with the company claiming it has up to 75% better battery life and 20% better performance than the original. Now they just need to sell some.

Windows Small Business Server
With the release of Windows Server 2012, Microsoft announced it would no longer release a small business version of the OS. The company is encouraging small business owners to take their needs to Microsoft's hosted cloud solutions instead. So you can either move to Azure or deploy Server 2012, Exchange Server and SharePoint. Which would you prefer?

Encarta
Microsoft first delivered Encarta on CD-ROM in 1993 as part of the early wave of multimedia products for PCs, before adding a website as well. In contrast to Wikipedia, which drew criticism for its dubious veracity, Microsoft sought credibility by acquiring content from established encyclopedias, including Collier's Encyclopedia and New Merit Scholar's Encyclopedia. The company had tried to buy Encyclopedia Britannica but was rebuffed.

Encarta just could not keep up with Wikipedia and fell totally behind. User changes and updates were enabled in 2006, but only after Encarta staff approved them. The result? Encarta Premium, the high-end product, boasted 62,000 articles compared to Wikipedia's 1 million-plus. In March 2009, Microsoft announced it was discontinuing both the Encarta disc and online versions.

Flight Simulator
This upset a lot of people because of how it was handled. Microsoft Flight Simulator was one of the company's oldest products, first hitting the market in 1978 from developer subLOGIC before Microsoft licensed it in 1982.

Flight Simulator had an extremely loyal fanbase and a huge mod/add-on market. These folks were really upset when Microsoft just killed the game, rather than trying to find a buyer to keep it going. But with the economic downturn in 2008, Microsoft started looking at its assets, and in early 2009, the games division took a big hit, with FlightSim being one of them.

Zune
Zune was a me-too product from Microsoft that came way too late. Normally, being late to market is not a hindrance for Microsoft; it's frequently late, and that hadn't been a problem before. With the Zune, Microsoft had a few interesting ideas, like sharing songs with other Zunes, but the player had no chance against the iPod. Microsoft introduced it in 2006 and killed it in 2011, but parts of Zune live on: the software player is used in Xbox Live and Windows Phone 8.

Kin
Kin was barely born; Microsoft killed the product literally weeks after launch. The Kin phones were ugly little things meant to be low-cost social phones aimed at the younger market, people who might not be able to afford a smartphone. Engadget did a good post-mortem on the whole deal, detailing how a complete OS rewrite and a focus on higher prices did in the Kin. Microsoft would put its efforts behind Windows Phone instead.

Windows Home Server
Bill Gates introduced this new home product at the 2007 Consumer Electronics Show, and it shipped that year. Based on Windows Server 2003 R2, it was meant for homes or small offices with multiple connected PCs, offering file sharing, automated backups, print server, and remote access. However, there was no real push from Microsoft or the OEMs. Microsoft would only sell it through OEMs. You couldn't just download it and install it on an old PC, which is what was so helpful to Linux in its early days. With such a middling effort, it went nowhere and was killed off last year.

Microsoft Works
Inspired by AppleWorks, a nifty little suite that originally shipped on the Apple II computer (I owned a copy, too), Microsoft shipped its first version, Works for DOS, in 1987. At the time, it was one giant app. Your word processor, spreadsheet and database all ran from the same application, just like AppleWorks. Microsoft would modernize it and usually offer it as part of a software bundle with new PCs for years. Finally, in 2009, Microsoft ended the project, replacing it with Office 2010 Starter Edition.

FrontPage
Originally developed by Vermeer Technologies, this rapid HTML development tool was acquired by Microsoft in 1996 and made part of Windows NT Server, which included the Internet Information Server web server software, and eventually the Office suite. FrontPage and IIS were very proprietary and really locked code into Microsoft products. Front-end and back-end software did not port easily, and Microsoft was criticized for that. As IIS and FrontPage matured, Microsoft moved away from the vendor lock-in.

In 2006, Microsoft announced that FrontPage would eventually be replaced by two far more advanced web development products: SharePoint Designer, for business professionals designing SharePoint-based applications, and Expression Web, targeted at web design professionals creating feature-rich websites. Microsoft discontinued FrontPage that year.

Microsoft Expression
This one didn't last long. Six years after its launch, Microsoft announced that Expression Studio would no longer be a standalone product. Expression Blend was integrated into Visual Studio, while Expression Web and Expression Design are now available as free products, although they get no technical support, and Microsoft doesn't plan to release new versions of either.

Microsoft Money
Microsoft didn't conquer every market it targeted. One area it could never crack was home finance. Intuit, maker of Quicken, has ruled that roost for decades. Microsoft tried to acquire the firm but was met with significant government resistance. So it tried competing with Quicken, with no luck. From 1991 to 2009, Microsoft spun its wheels with Money, with very low market share to show for its efforts.

IronRuby
Open source supporters were cautiously optimistic a decade ago that Microsoft might be embracing open source, with things like Port 25 and projects like IronPython and IronRuby. Well, scratch the last one. No official announcement was made, but word started leaking out when a former employee who had worked on the project noted in blog posts that no one was left working on it.

IronRuby died from abandonment, and there is skepticism that Microsoft ever made a real effort with it. The project is now maintained by volunteers, and its revisions have been slow and minor in recent years.

Windows Live OneCare
Microsoft's first attempt at a security suite, OneCare was based on Reliable Antivirus (RAV), which Microsoft purchased from GeCAD Software Srl in 2003. The software offered disk cleanup and defragmentation, a full virus scan, backup notification, checking for updates and a firewall. However, the software took a pounding from critics and security experts, many of whom rated the AV scanner very low, near the bottom in tests, and said the firewall allowed for too many potential exceptions. And Microsoft was selling this for $59. It abandoned the software with the release of Windows 7 and introduced Microsoft Security Essentials, which does a better job overall at malware detection.

Xbox One DRM
This one nearly proved suicidal. Microsoft initially proposed DRM for the Xbox One that included mandatory Internet connections, restrictions on sharing games with friends, and a requirement that the Kinect motion-detection camera be connected at all times. The proposal was met with howls from furious gamers and promises of a boycott.

Inside of a month, Microsoft relented on everything. The result: the Xbox One vaulted to No. 1 in Amazon presales, ahead of the PlayStation 4. Both consoles are expected to sell a few million units when they ship next month.


Best Microsoft MCTS Certification, Microsoft MCITP Training at certkingdom.com



Friday, 11 October 2013

Infographic: Facebook vs. Twitter 2010 user stats

Another day, another pretty infographic. This one breaks down the demographic differences between Facebook and Twitter.
Facebook and Twitter are the big boys in the social networking space. So big, in fact, that we’ve probably written about them a bit too much in 2010. But hey, why stop in December? This breakdown was put together by Digital Surgeons and shows demographic statistics (and a few fun facts) for both sites. You may know that Facebook is much larger, with 500 million users to Twitter’s 106 million, but did you know that 52 percent of Tweeters update their status every day, while only 12 percent of Facebook users do the same? Or that half of Twitter’s users are in college, compared to only 28 percent of Facebook users? It shows just how much Facebook has changed since its days as a university-only social network. Enjoy.



Tuesday, 8 October 2013

Google relaxes access controls to Apps docs

Google relaxes access controls to Apps docs
People without a Google Account will be able to view documents stored in the Apps suite

Adding convenience possibly at the expense of security, Google will now let people without a Google Account view documents stored in its Apps cloud suite.

The move is meant to simplify how Apps customers share files with outsiders.

Until now, Apps customers could only grant document access to users with a Google Account. People who didn't have an account or who weren't logged in to their account couldn't get into the documents even when invited to do so via an emailed link from an Apps user.

That will no longer be the case, Google said on Monday.

The change applies to word processing files created with Docs, presentations created with Slides and charts created with Drawings, all Google cloud productivity apps included in Apps, the company's workplace collaboration and communication suite.

"As a result of this change, files shared outside your domain to an email address not linked to an existing Google Account can be viewed without having to sign in or create a new Google Account," reads the Google blog post.

These recipients will only be able to view the file. They won't be able to edit or add comments to it, actions that still require the recipient to be logged into a Google Account.

Google warns that "because no sign in is required, anyone may view the file with this sharing link." In other words, the file could end up being viewed by unintended users who somehow get their hands on the link. This possibility is erased if the recipient creates a Google Account, at which point the link becomes unusable for others.

The company started to roll out the feature on Monday to Apps customers that are on the "rapid release" track, which delivers new and changed functions to administrators and end users as soon as they go live. The feature will later reach Apps customers on the "scheduled release" track, which delivers updates once a week and makes them available to administrators first.

Apps administrators will be able to disable this feature for their users on their domain control console.





Friday, 4 October 2013

Microsoft dings Ballmer's bonus over Windows 8, Surface RT struggles

The penalty is equivalent to half the cost of a cup of coffee at McDonald's for the average American

Microsoft's board of directors reduced outgoing CEO Steve Ballmer's bonus for the 2013 fiscal year, citing poor performance of Windows 8 and the $900 million Surface RT write-off, according to a filing with the U.S. Securities and Exchange Commission.
Microsoft CEO Steve Ballmer (Photo: Microsoft)

The Redmond, Wash., company's proxy statement spelled out the salaries and bonuses of several of its top executives, including Ballmer, new Chief Financial Officer Amy Hood and Chief Operating Officer Kevin Turner, as well as now-departed managers such as former CFO Peter Klein and Office chief Kurt DelBene.

Microsoft paid Ballmer $697,500 in salary and awarded him a $550,000 performance bonus, for a total of $1.26 million for fiscal year 2013.

The bonus was less than Ballmer could have earned.

"Our Board of Directors approved an Incentive Plan award of $550,000 which was 79% of Mr. Ballmer's target award," stated the proxy. One hundred percent of the target would have been $696,000.

The 79% was considerably lower than Ballmer's comparable number for the 2012 fiscal year, when he was granted a bonus representing 91% of his target.
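The proxy's arithmetic is easy to sanity-check. A quick back-of-the-envelope sketch, using only the $550,000 award and the 79%-of-target figure reported in the filing:

```python
# Reconstruct Ballmer's target bonus from the figures in the proxy.
award = 550_000          # FY2013 Incentive Plan award
pct_of_target = 0.79     # award as a fraction of target, per the proxy

target = award / pct_of_target
forfeited = target - award

print(round(target, -3))     # rounds to 696,000, matching the proxy
print(round(forfeited, -3))  # roughly 146,000 left unpaid
```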

Microsoft's board cited both wins and losses under Ballmer's stewardship, but it was the failures that drove its bonus decision.

"While the launch of Windows 8 in October 2012 resulted in over 100 million licenses sold, the challenging PC market coupled with the significant product launch costs for Windows 8 and Surface resulted in an 18% decline in Windows Division operating income," the proxy noted. "Slower than anticipated sales of Surface RT devices and the decision to reduce prices to accelerate sales resulted in a $900 million inventory charge."

Some analysts have speculated that the $900 million write-off was the proverbial straw that broke the board's back, and triggered Ballmer's ouster. In an interview with the Wall Street Journal last week, however, John Thompson, the lead independent director and the head of the committee in charge of the search for a new chief executive, backed Ballmer's explanation for his sudden retirement: He did not want to remain in the job through the long course correction to a "devices-and-services" strategy.

The proxy statement's commentary on the strategy change, as well as the corporate reorganization announced in July, was Ballmer-neutral. "The company continued to make progress in its devices and services strategy," the filing read.

Last year, Ballmer's bonus was pegged at 91% of his target as the board ticked off several issues during that fiscal year, including a 3% decline in revenue for the Windows and Windows Live Division, and a fiasco where Microsoft failed to offer a browser choice screen to Windows 7 customers in the European Union.

Ballmer's 2013 bonus of 79% was an even lower percentage than that of Steven Sinofsky last year. Then, the former Windows chief -- who was ousted in November 2012 -- received 90% of his target award, even though he, like Ballmer, was cited as responsible for the EU browser choice screw-up.

Other top-tier executives received 100% or more of their target bonuses for 2013.

Kevin Turner, the COO, received a cash award of $2.1 million, or 100% of his target, and Satya Nadella, who now leads the Cloud and Enterprise group, received $1.6 million, or 105% of his target. Amy Hood, the new CFO, was handed $457,443, 100% of her target incentive, and as part of her promotion, received a stock award in May of 103,413 shares that will vest over the next three years. At Thursday's closing price, those shares had a paper value of $3.5 million.

In total compensation for the 2013 fiscal year, Turner remained Microsoft's highest-paid executive at $10.4 million, down slightly from 2012's $10.7 million.

Eight of the company's top executives, including Turner and Hood, were handed additional stock grants Sept. 19, the same day Microsoft announced a retention bonus designed to keep upper management from jumping ship during the CEO search. Turner, for example, received grants currently worth $20.3 million. Hood's award was valued at Thursday's closing bell at nearly $3.9 million.

No one should cry for Ballmer's lowered bonus: According to the proxy, he controls 4% of the company, with stock holdings worth $11.3 billion at Thursday's price. Only co-founder and chairman Bill Gates holds more: 4.5%, or $12.8 billion.

The $146,000 that Ballmer did not get in his 2013 bonus is literally pocket change to the billionaire. The amount represented 0.0013% of Ballmer's Microsoft holdings, and an even smaller percentage of his total wealth. To put that into perspective, 0.0013% of $42,693, the U.S. per capita personal income in 2012, is 55 cents, or just over half the price of a coffee from McDonald's "Dollar Menu."
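Those percentages follow directly from the dollar figures quoted above; a minimal sketch of the arithmetic, where every input is the article's own number:

```python
forfeited = 146_000              # bonus amount Ballmer did not receive
holdings = 11_300_000_000        # value of his Microsoft stake

share = forfeited / holdings
print(f"{share:.4%}")            # 0.0013% of his Microsoft holdings

per_capita_income = 42_693       # 2012 U.S. per capita personal income
print(round(share * per_capita_income, 2))  # about 0.55, i.e. 55 cents
```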

Ballmer and Gates are both on the directors slate for re-election next month when Microsoft hosts its shareholders meeting.

According to a report by the Reuters news service earlier this week, some of Microsoft's biggest investors have urged the board to push Gates out of the chairman's role, concerned that he will block the board from making drastic changes and handcuff the new CEO to the devices-and-services strategy, which they question. Gates is also on the special search committee tasked by the board with recommending Ballmer's replacement.



Thursday, 26 September 2013

Kindle Fire HDX tablets show big push for business users

Amazon Wednesday unveiled two Kindle Fire HDX tablets with features that indicate a clear attempt to attract business users to the platform.

The 7-in. HDX is slated to begin shipping on Oct. 18 for $229 ($329 for a 4G version) for either AT&T or Verizon Wireless. An 8.9-in. version will cost $379 but won't ship until Nov. 7, with a 4G variant priced at $479.

The biggest enterprise-centric features are contained in what Amazon calls its updated Fire OS 3.0 "Mojito" that's built on Android. An over-the-air 3.1 update of Mojito is promised for mid-November.

The business-class features include hardware and software encryption, secure Wi-Fi for access to corporate apps and SharePoint, a native VPN client and single sign-on capabilities, Amazon said.

Android enterprise and productivity apps such as GoToMeeting, Evernote, Cisco AnyConnect and Documents To Go, can be found at the Amazon AppStore.

Kindle-specific device management APIs (application programming interfaces) are included so that IT workers can manage the HDX devices through mobile device management software from vendors like AirWatch, Citrix, Fiberlink and Good Technology, Amazon said.

Amazon created a "Kindle Fire for Work" Web page that describes some of the new features, such as a "robust corporate e-mail experience" using Exchange email with ActiveSync "that keeps you connected to your company's Exchange server while also meeting IT's security policies."

The new hardware appears to be designed with enterprise users -- and consumers -- in mind.

For instance, there is a unique "Mayday" button that, when pressed, summons live, free tech support within 15 seconds. Some early reviewers have already questioned the privacy implications of the Mayday function.

Amazon said the battery life offers all day use -- up to 11 hours of mixed use at a time. The 8.9-in. model is a light 13.2 ounces, or 34% lighter than the current model and the lightest large-screen tablet on the market. By comparison, the device is nearly 10 ounces lighter than the 9.7-in. Apple iPad.

Brighter, higher-definition displays include a 1920 x 1200 panel with 323 pixels per inch (PPI) on the smaller version and a 2560 x 1600 panel, or 339 PPI, on the larger one. Amazon boasted that both models have three times the processing power of the last generation of Kindles, with 2.2GHz quad-core Snapdragon 800 processors.
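The quoted resolutions and pixel densities are consistent with the advertised 7-in. and 8.9-in. screen sizes; a quick check, since diagonal size in inches is simply the diagonal in pixels divided by PPI:

```python
import math

# Diagonal (in.) = sqrt(width^2 + height^2) / pixels-per-inch
for w, h, ppi in [(1920, 1200, 323), (2560, 1600, 339)]:
    diag = math.hypot(w, h) / ppi
    print(f"{w}x{h} at {ppi} PPI -> {diag:.1f}-in. display")
```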

For some analysts and reviewers, it comes as a mild surprise that Amazon is pitching its tablets to workers looking to use the devices on the job, especially since smaller tablets seem better suited for consumption than for productivity. Amazon seems to have anticipated such concerns by citing a statement from ROI Training, a corporate user of previous Kindle Fire tablets, proclaiming that the device has made it easier for its employees to stay productive at both work and home.

Kindle is already the second most popular tablet at work in the U.S., said Amazon's Raghu Murthi, vice president of enterprise and education, in a statement. "As employees increasingly bring their own devices to work, the new Kindle Fire tablets can easily be integrated into the workplace with the new enterprise features," Murthi said.

Microsoft this week unveiled the Surface 2 and Surface Pro 2 tablets, both with 10.6-in. displays that allow them to approach laptop capabilities when used with covers that double as keyboards. Microsoft adapted the tablet kickstand of both devices to work in two positions to enable them to be used more easily as laptops.

Analysts believe larger displays are considered better for maximum productivity, while 7-in. to 8-in. displays are generally seen as consumption devices, for reading books and watching videos. In its new tablets, Amazon stuck with the smaller form-factor and at the same time chose to market them as productivity devices for workers and consumers, while noting that many customers will use the machines for both work and personal use.

It remains to be seen how Amazon's new enterprise-ready features will resonate.

IDC and other analyst firms have noted a strong trend toward sales of smaller tablets in the 7-in. to 8-in. size to business users, and even the iPad mini, at 7.9-in. is designed to capitalize on that trend. Amazon benefits from a huge online store of products and services that will resonate with all kinds of tablet customers, analysts have said.

"Amazon hasn't had much traction with Fire tablets in the enterprise, but they're clearly targeting that group more [with HDX]," said IDC anayst Tom Mainelli. "It remains to be seen if they'll have any luck there, but they are putting the right features into the products to make that happen."





Sunday, 22 September 2013

BlackBerry warns of disastrous Q2

BlackBerry warns of disastrous Q2
Slashing 4,500 jobs, reporting $1B loss, stock price plunges 20% in late afternoon trading

BlackBerry shares plunged 20% late Friday afternoon as the company announced plans to fire 40% of its employees and eventually cut expenditures in half over the next nine months. The actions were in response to the company’s warning of a collapse in second fiscal quarter earnings.

The company said quarterly revenues were expected to be about $1.6 billion, where Wall Street had been expecting over $3 billion. It also said that the quarter’s GAAP net operating loss would be nearly $1 billion.

Nasdaq halted trading in BBRY shares about 35 minutes before the preliminary figures were officially announced. When trading resumed, shares plunged 20% to $8.72.


A statement by CEO Thorsten Heins suggested BlackBerry is all but abandoning the consumer smartphone market.

“Going forward, we plan to refocus our offering on our end-to-end solution of hardware, software and services for enterprises and the productive, professional end user,” Heins said. “This puts us squarely on target with the customers that helped build BlackBerry into the leading brand today for enterprise security, manageability and reliability.”

During the quarter, about 5.9 million BlackBerry smartphones were sold through to end customers. BlackBerry didn’t break out how many were based on the new BlackBerry 10 operating system and how many were based on the prior OS platform.

The marketplace failure of its BlackBerry Z10 touch smartphone was underlined in a massive charge against inventory. BlackBerry said it will “report a primarily non-cash, pre-tax charge against inventory and supply commitments in the second quarter of approximately $930 million to $960 million, which is primarily attributable to BlackBerry Z10 devices.”

BlackBerry also said it’s changing its smartphone lineup to focus on “enterprise and prosumer-centric targeted devices, including two high-end devices and two entry-level devices in all-touch and QWERTY models.” The Z10 will target “a broader, entry-level audience.”




Tuesday, 17 September 2013

China Struggling to Compete in the IT Outsourcing Arena

The Chinese government has made no secret of the fact that it wants to compete with IT and business process outsourcing powerhouse India on the global outsourcing stage. In 2006, the country's Ministry of Commerce unveiled its "1,000-100-10 Project," which aimed to double China's services exports by establishing 10 outsourcing hubs, attracting 100 multinationals to its shores, and developing 1,000 local vendors capable of meeting the demands of international customers.

So how's that been going? Very slowly, according to Arie Lewin, director of the Center for International Business Education and Research at Duke University's Fuqua School of Business.

The country's IT and business process services industry, which Lewin conservatively estimates is worth about $50 billion today, has seen little growth in recent years. "China has a national goal to build up this industry as a new lever of economic development, and this idea that they could leapfrog India," says Lewin. "But progress has been very slow."

Chalk part of it up to bad timing. "They're trying to get into an industry whose growth rate has leveled off, and it's tough to take market share away from anybody," says Lewin.

China's Services Providers Need Better Talent and Security to Compete

More fundamentally, providers in China face two other impediments: talent problems and a terrible reputation for intellectual property protection and security. Few Chinese professionals see business services as a viable career path. Providers who want to attract the best and brightest often pay a 20 percent premium on salaries, says Lewin, and there's already little labor arbitrage to be had in cities like Shanghai.

Lack of training in project management and process leaves many companies stuck doing low level work. And lack of English proficiency excludes most players from the lucrative outbound call center business.

Then there's the IP problem. Rightly or wrongly, Chinese service providers suffer "the negative consequences of organizations in China that are bombarding Western corporate Web sites and violating intellectual property," says Lewin. "Companies are not giving providers work because of this insecurity." One U.S.-based provider is spending $3 million a year to maintain its firewalls, Lewin says. "If that's what has to happen, it's unreasonable."

How Chinese Service Providers View Their Position

To find out more about how providers view the situation, Lewin, along with Shanghai Jiao Tong University professor Liu Yi, recently surveyed 250 providers in China, 71 percent of whom are headquartered there. What they found was a universe of undersized, young companies struggling to compete for international business.

Small providers (fewer than 500 employees) account for 87 percent of the market, according to the survey; and 79 percent of the providers had been in business for less than ten years. Just 22 percent of respondents said they had implemented Six Sigma principles, while 60 percent reported implementation of some ISO standards.

When asked about the top new services they planned to offer, 24 percent indicated software development, while 12 percent answered IT infrastructure support and product design. The majority of providers also said they expected new work to come from China and Asia rather than Europe or the U.S.

"One of the most telling themes was that, in the future, they want to focus more on the domestic industry than international clients. They realize they're not at the level of professionalism that makes them competitive for international business," Lewin says. "They're not ready to leapfrog India." That's a downshift from the more aspirational attitudes Lewin says he had seen in recent years.

Since 2006, the Chinese government has altered its approach to bolstering its services industry, identifying more than 20 cities that might be able to develop a good model that could be used throughout the country. "The approach is very Chinese," Lewin says.

But Lewin has some other suggestions for the government, such as creating incentives for ISO standards compliance. That would send a clear message about the importance of process to the business. "China doesn't have a process orientation the way India or Germany or Japan does," Lewin says. "In China, if they find a shortcut they will take it, and they will not document it. I hear it all the time."

The U.S. mid-market could also be a growth opportunity for Chinese companies, but trying to sell internationally is prohibitively expensive for small players. The Chinese government "could create representative offices in the U.S. and reduce the marketing costs for them," says Lewin.



Wednesday, 11 September 2013

Buggy Microsoft update hamstrings Outlook 2013

Folder pane goes blank after stability and performance update Tuesday; Microsoft pulls update from Windows Update and WSUS

An Office 2013 non-security update, part of yesterday's massive Patch Tuesday, blanks the folder pane in Outlook 2013, the suite's email client, drawing complaints from customers on Microsoft's support forum.

The update, identified as KB2817630, was meant to quash several stability and performance bugs in a number of the suite's components, including Excel, SharePoint Server and Lync; fix a problem that caused Office to freeze when a document was opened in the "Protected Mode" sandbox; and more.

Instead, it emptied Outlook 2013's folder pane.

"I can't view my list of e-mail accounts, folders, favorites, etc.," said Trevor Sullivan in a message Tuesday that kicked off a long support thread.

Scores of others quickly chimed in to say the same had happened to them after applying the update on PCs running Windows 7 or Windows 8.

"Same problem on multiple fully-updated Windows 7 Enterprise Edition, Windows 8 Enterprise Edition and Windows 8.1 Enterprise Edition workstations ... all with Office 2013 32-bit," said "MiToZ" on the same thread.

Within minutes of Sullivan's post, users reported that they'd gotten the folder pane view back after uninstalling KB2817630.

Microsoft was not available for comment late Tuesday, and it has not posted any information about the glitch on its various Office-related blogs. Nor have company representatives weighed in on the support discussion thread, as they sometimes do.

However, users said that the original update had been pulled from both Windows Update and Windows Server Update Services (WSUS). The former is the patch service aimed at consumers and very small businesses, while the latter is the Microsoft-provided patch delivery and management service used by most businesses. Others reported that they'd contacted their Premier Support representatives -- a support plan available only to Microsoft's largest customers -- but had not been told when a fix would be available.

The gaffe is the latest in a series of embarrassments for Microsoft stemming from flawed updates. In August, the Redmond, Wash., company yanked an Exchange security update, saying it had not properly tested the patches. In April, Microsoft urged Windows 7 users to uninstall an update that crippled PCs with the notorious "Blue Screen of Death"; it re-released the update two weeks later.

A few users dealing with the empty folder pane bemoaned the trend.

"Yeah, another Microsoft Update Tuesday Blunder," said "Triple Helix" on the long thread.

"Someone on [Microsoft's] update testing team needs to get fired," added "The Computer Butler."

The flawed Office 2013 stability and performance update was issued yesterday alongside a 13-bulletin, 47-patch collection of security fixes that closed vulnerabilities in Windows, Internet Explorer, SharePoint, Word, Excel and Outlook.
