Friday 18 December 2015

210-260 Implementing Cisco Network Security


QUESTION 1
Which two services define cloud networks? (Choose two.)

A. Infrastructure as a Service
B. Platform as a Service
C. Security as a Service
D. Compute as a Service
E. Tenancy as a Service

Answer:

Explanation:


QUESTION 2
In which two situations should you use out-of-band management? (Choose two.)

A. when a network device fails to forward packets
B. when you require ROMMON access
C. when management applications need concurrent access to the device
D. when you require administrator access from multiple locations
E. when the control plane fails to respond

Answer:

Explanation:


QUESTION 3
In which three ways does the TACACS protocol differ from RADIUS? (Choose three.)

A. TACACS uses TCP to communicate with the NAS.
B. TACACS can encrypt the entire packet that is sent to the NAS.
C. TACACS supports per-command authorization.
D. TACACS authenticates and authorizes simultaneously, causing fewer packets to be transmitted.
E. TACACS uses UDP to communicate with the NAS.
F. TACACS encrypts only the password field in an authentication packet.

Answer:

Explanation:
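TACACS+ runs over TCP (port 49), encrypts the entire packet body, supports per-command authorization, and separates authentication from authorization; RADIUS runs over UDP (1812/1813) and obfuscates only the User-Password attribute. As a rough illustration of that last point, here is a minimal Python sketch of the RFC 2865 password-hiding scheme — the password and shared secret are made-up placeholders, and the point is simply that only this one attribute is protected:

```python
import hashlib
import os

def radius_hide_password(password: bytes, secret: bytes, authenticator: bytes) -> bytes:
    """Obfuscate a RADIUS User-Password attribute (RFC 2865, section 5.2).

    Only this single attribute is protected; the rest of a RADIUS packet is
    sent in cleartext. TACACS+, by contrast, encrypts the entire packet body
    and carries it over a TCP connection to port 49.
    """
    # Pad the password with NUL bytes to a multiple of 16 octets.
    padded = password + b"\x00" * (-len(password) % 16)
    hidden = b""
    prev = authenticator
    for i in range(0, len(padded), 16):
        block = padded[i:i + 16]
        mask = hashlib.md5(secret + prev).digest()
        cipher = bytes(a ^ b for a, b in zip(block, mask))
        hidden += cipher
        prev = cipher
    return hidden

if __name__ == "__main__":
    # Placeholder credentials and a random 16-byte Request Authenticator.
    print(radius_hide_password(b"s3cr3t", b"shared-secret", os.urandom(16)).hex())
```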


QUESTION 4
According to Cisco best practices, which three protocols should the default ACL allow on an
access port to enable wired BYOD devices to supply valid credentials and connect to the network?
(Choose three.)

A. BOOTP
B. TFTP
C. DNS
D. MAB
E. HTTP
F. 802.1x

Answer:

Explanation:
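For context: in Cisco's wired BYOD designs, the default port ACL typically permits only the bootstrap traffic a device needs before it is authorized — DHCP (BOOTP), DNS, and TFTP — while 802.1X and MAB are authentication mechanisms on the port itself rather than protocols an ACL would permit. The snippet below is only an illustration of that idea in Python (not IOS ACL syntax); the port numbers are the IANA defaults.

```python
# Illustration only - not Cisco IOS ACL syntax. Port numbers are IANA defaults.
PRE_AUTH_ALLOWED_UDP_PORTS = {
    67: "BOOTP/DHCP (server)",
    68: "BOOTP/DHCP (client)",
    53: "DNS",
    69: "TFTP",
}

def permitted_before_authorization(protocol: str, dst_port: int) -> bool:
    """Would this packet pass the sketch of a default pre-auth port ACL?"""
    return protocol.lower() == "udp" and dst_port in PRE_AUTH_ALLOWED_UDP_PORTS

print(permitted_before_authorization("udp", 53))   # True  - DNS is allowed
print(permitted_before_authorization("tcp", 80))   # False - HTTP waits for authorization
```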


QUESTION 5
Which two next-generation encryption algorithms does Cisco recommend? (Choose two.)

A. AES
B. 3DES
C. DES
D. MD5
E. DH-1024
F. SHA-384

Answer:

Explanation:
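Cisco's Next Generation Encryption guidance favors AES (ideally AES-GCM) and the SHA-2 family, such as SHA-384, while treating DES, 3DES, MD5 and 1024-bit DH as legacy. A small Python sketch of the two recommended primitives follows; it assumes the third-party cryptography package is installed, and the message is a placeholder.

```python
# Assumes the third-party "cryptography" package is installed (pip install cryptography).
import hashlib
import os

from cryptography.hazmat.primitives.ciphers.aead import AESGCM

message = b"next-generation encryption demo"   # placeholder data

# SHA-384 (SHA-2 family) - recommended; MD5 is considered legacy.
print("SHA-384:", hashlib.sha384(message).hexdigest())

# AES-256-GCM authenticated encryption - recommended; DES/3DES are legacy.
key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)
ciphertext = AESGCM(key).encrypt(nonce, message, None)
print("AES-GCM ciphertext bytes:", len(ciphertext))
```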

Friday 11 December 2015

Is Microsoft about to get rid of MCSA?

On Monday, Microsoft Learning’s Born To Learn blog released some information on upcoming Windows 10 exams and related certification news — and if you read between the lines, it appears the Microsoft Certified Solutions Associate (MCSA) certification may be riding off into the sunset sometime in the near future. I’ll talk about why I think this is the case a little later on, but first let’s look at the Windows 10 exam news.

In this blog post, Microsoft announced that the first Windows 10 exam will be 70-697: Configuring Windows Devices. This exam was released in beta back in September, and is reportedly still available to candidates. If you decide to take the beta exam, be warned that it does not qualify for Microsoft’s “Second Shot” free retake promotion, and score reports won’t be issued for several weeks after the beta period ends.

For those interested in the 70-697 exam, here is a list of the knowledge domains and how much exam content is devoted to each:

● Manage identity (13 percent)
● Plan desktop and device deployment (13 percent)
● Plan and implement a Microsoft Intune device management solution (11 percent)
● Configure networking (11 percent)
● Configure storage (10 percent)
● Manage data access and protection (11 percent)
● Manage remote access (10 percent)
● Manage apps (11 percent)
● Manage updates and recovery (10 percent)

The blog post goes on to say that the second Windows 10-related exam will be 70-698: Planning for and Managing Windows Devices. This exam is still being developed, and hasn’t been released to beta yet.

There was also Windows 10 certification news for software developers. There are two new Microsoft Certified Solutions Developer (MCSD) exams currently running in beta:

● 70-354: Universal Windows Platform – App Architecture and UX/UI
● 70-355: Universal Windows Platform – App Data, Services, and Coding Patterns

If you pass both of these exams, along with exam 70-483: Programming in C#, you earn the MCSD: Universal Windows Platform certification.

Okay, now for the good stuff. Let’s talk about the MCSA, and why I think it’s going away.

Here is a list of every MCSA certification track available as of this writing:

● Windows 7
● Windows 8
● Windows Server 2008
● Windows Server 2012
● SQL Server 2008
● SQL Server 2012
● Office 365

There is no MCSA track for the new SQL Server 2014 — its exams were added to existing MCSE certification tracks instead. So, when the SQL Server 2008 and 2012 exams are eventually retired, the SQL Server MCSA tracks will be gone.

On the desktop side, the MCSA: Windows 7 certification is still available, and will likely be so well into 2016. Why? Because Windows 7 is still the most prevalent client OS among Microsoft’s enterprise customers. Windows 10 is gaining momentum, but it will take more time and testing before it takes over the business world.

That said, the MCSA: Windows 7 exams are now six years old, and Microsoft will want to retire them as soon as Windows 10 reaches a certain market share. Once this happens, the MCSA: Windows 7 track will be gone.

What about the MCSA: Windows 8 track? According to the MS Learning blog, the MCSA: Windows 8 certification is being retired on Jan. 31. The Windows 8.1 upgrade exams (70-689 and 70-692) will also be retired on that date. The two current Windows 8 MCSA exams (70-687 and 70-688) will be available until July 31 — but passing either exam will result in a Microsoft Specialist certification, not an MCSA.

Exit, MCSA: Windows 8 track. We hardly knew ye.

But, surely there will be an MCSA track for Windows 10, right? Wrong! And don’t call me Shirley. (Leslie Nielsen, FTW!)

The aforementioned smoking gun blog post states that passing one of the upcoming Windows 10 exams will earn candidates a Microsoft Specialist certification — and that these exams will be recommended prerequisites for the MCSE: Enterprise Devices and Apps track. So no, there will be no MCSA for Windows 10.

That just leaves us with nothing but the Windows Server and Office 365 MCSA tracks.

If the MCSA is to live on, it will most likely hang its hat on the upcoming Windows Server 2016 release. If this is not the case, however, then the MCSA for Windows Server 2008 and 2012 will eventually be retired, and that will be the end of the MCSA for Windows Server track.

That leaves the MCSA: Office 365 certification track. This oddball MCSA only contains two exams, and it isn’t hard to imagine that Microsoft would simply reclassify these exams as Specialist certifications to eliminate the MCSA: Office 365 track.

And, that’s it. If the above comes to pass, then the MCSA certification will no longer be available.

One last piece of info … in the MS Learning blog post, a commenter directly asked about the future of the MCSA certification. The response from the post’s author, Larry Kaye, Senior Product Manager for Technical Certification at Microsoft Learning, was as follows:

“There are no plans to retire the MCSA level of certification at this time.” (Emphasis mine.)

So, we will see. Personally, I think there is ample evidence to demonstrate that Microsoft is at least seriously considering ending the MCSA. What do you think? Let us know in the comments below.

Wednesday 2 December 2015

IT pros average 52-hour workweek

Employees in small IT departments tend to work more hours than those in large IT departments

It’s no surprise that a majority of IT pros work more than 40 hours per week, but it’s interesting to learn that some are putting in significantly longer workweeks, according to new survey data from Spiceworks.

Among 600 IT pros surveyed, 54% said they work more than 40 hours per week. At the high end of the overtime group, 18% of respondents said they work more than 60 hours per week, and 17% said they top 50 hours per week. The average workweek among all respondents is 52 hours, Spiceworks reports.

The data comes at a time when hiring managers say it’s tough to hire experienced talent and IT pros say they’re more willing to switch jobs for a better offer. Companies claim to be boosting pay and increasing benefits and perks to entice employees – yet technical talent averages 10+ hours per day, according to the Spiceworks data.

When it surveyed respondents about IT staffing practices, Spiceworks hoped to find a consensus about the ideal IT staff-to-user ratio that would enable adequate incident response times without overworking IT staff. The company – which offers free management software and hosts a community for IT pros – didn’t come up with any universal formula, but it did share information about staffing trends across multiple industries and different sized companies. Here are a few of the survey findings.

Industry plays a big role in IT workload
IT pros who work in government and education are less likely to work extra hours than those in other industries. In education and government, only 33% and 37% of staff, respectively, work more than a 40-hour week.

In the construction/engineering and manufacturing industries, workweeks exceeding 50 hours are the norm. Construction/engineering is at the high end of the scale, with 72% of staff working long hours. In manufacturing, 60% of staff work more than a 40-hour week.

Large IT departments share workloads more effectively
Spiceworks found a correlation between the size of IT departments and the number of hours worked. Organizations with 40-hours-or-less workweeks tend to have larger IT departments (an average of 17 employees). Conversely, smaller IT departments tend to require more than 40 hours per week. The average overworked IT department has 10 or fewer staff members.

Helpdesk size, in particular, shapes the workload
Solving end users’ problems is one reason IT staff is overworked, Spiceworks concludes. Its survey found that IT pros in departments with more dedicated helpdesk technicians work fewer hours on average, while IT pros in departments with fewer helpdesk technicians tend to work more than 40 hours per week. Specifically, organizations with 40-hours-or-less workweeks have an average of 9 helpdesk technicians; organizations with more-than-40-hour workweeks have an average of 3 helpdesk technicians.

Wednesday 25 November 2015

Exam 77-418 Word 2013


Published: February 28, 2013
Languages: English
Audiences: Information workers
Technology: Microsoft Office 2013 suites
Credit toward certification: MOS

Skills measured
This exam measures your ability to accomplish the technical tasks listed below. The percentages indicate the relative weight of each major topic area on the exam. The higher the percentage, the more questions you are likely to see on that content area on the exam. View video tutorials about the variety of question types on Microsoft exams.

Please note that the questions may test on, but will not be limited to, the topics described in the bulleted text.

Do you have feedback about the relevance of the skills measured on this exam? Please send Microsoft your comments. All feedback will be reviewed and incorporated as appropriate while still maintaining the validity and reliability of the certification process. Note that Microsoft will not respond directly to your feedback. We appreciate your input in ensuring the quality of the Microsoft Certification program.

If you have concerns about specific questions on this exam, please submit an exam challenge.


Create and manage documents
Create a document
Creating new blank documents, creating new documents using templates, importing files, opening non-native files directly in Word, opening a PDF in Word for editing
Navigate through a document
Searching for text within document, inserting hyperlinks, creating bookmarks, using Go To
Format a document
Modifying page setup, changing document themes, changing document style sets, inserting simple headers and footers, inserting watermarks, inserting page numbers
Customize options and views for documents
Changing document views, using zoom, customizing the quick access toolbar, customizing the ribbon, splitting the window, adding values to document properties, using show/hide, recording simple macros, assigning shortcut keys, managing macro security
Configure documents to print or save
Configuring documents to print, saving documents in alternate file formats, printing document sections, saving files to remote locations, protecting documents with passwords, setting print scaling, maintaining backward compatibility

Preparation resources
Basic tasks in Word 2013
Create your first Word 2013 document (training)
Move around in a document using the navigation pane

Format text, paragraphs, and sections

Insert text and paragraphs
Appending text to documents, finding and replacing text, copying and pasting text, inserting text via AutoCorrect, removing blank paragraphs, inserting built-in fields, inserting special characters
Format text and paragraphs
Changing font attributes, using find and replace to format text, using format painter, setting paragraph spacing, setting line spacing, clearing existing formatting, setting indentation, highlighting text selections, adding styles to text, changing text to WordArt, modifying existing style attributes
Order and group text and paragraphs
Preventing paragraph orphans, inserting breaks to create sections, creating multiple columns within sections, adding titles to sections, forcing page breaks

Preparation resources
Insert fields
Copy formatting using the format painter
Add a page break

Create tables and lists
Create a table
Converting text to tables, converting tables to text, defining table dimensions, setting AutoFit options, using quick tables, establishing titles
Modify a table
Applying styles to tables, modifying fonts within tables, sorting table data, configuring cell margins, using formulas, modifying table dimensions, merging cells
Create and modify a list
Adding numbering or bullets, creating custom bullets, modifying list indentation, modifying line spacing, increasing and decreasing list levels, modifying numbering

Preparation resources

Insert a table
Convert text to a table or a table to text
Change bullet style

Apply references

Create endnotes, footnotes, and citations
Inserting endnotes, managing footnote locations, configuring endnote formats, modifying footnote numbering, inserting citation placeholders, inserting citations, inserting bibliography, changing citation styles
Create captions
Adding captions, setting caption positions, changing caption formats, changing caption labels, excluding labels from captions

Preparation resources
Add footnotes and endnotes
Create a bibliography

Insert and format objects
Insert and format building blocks
Inserting quick parts, inserting textboxes, utilizing the building blocks organizer, customizing building blocks
Insert and format shapes and SmartArt
Inserting simple shapes, inserting SmartArt, modifying SmartArt properties (color, size, shape), wrapping text around shapes, positioning shapes
Insert and format images
Inserting images, applying artistic effects, applying picture effects, modifying image properties (color, size, shape), adding quick styles to images, wrapping text around images, positioning images

Preparation resources
Quick parts
Change the color of a shape, shape border, or entire SmartArt graphic
Move pictures or clip art

Tuesday 17 November 2015

Exam 70-697 Configuring Windows Devices (beta)


Published: September 1, 2015
Languages: English
Audiences: IT professionals
Technology: Windows 10
Credit toward certification: Specialist

Skills measured
This exam measures your ability to accomplish the technical tasks listed below. The percentages indicate the relative weight of each major topic area on the exam. The higher the percentage, the more questions you are likely to see on that content area on the exam. View video tutorials about the variety of question types on Microsoft exams.

Please note that the questions may test on, but will not be limited to, the topics described in the bulleted text.

Do you have feedback about the relevance of the skills measured on this exam? Please send Microsoft your comments. All feedback will be reviewed and incorporated as appropriate while still maintaining the validity and reliability of the certification process. Note that Microsoft will not respond directly to your feedback. We appreciate your input in ensuring the quality of the Microsoft Certification program.

If you have concerns about specific questions on this exam, please submit an exam challenge.

Manage identity (13%)
Support Windows Store and cloud apps
Install and manage software by using Microsoft Office 365 and Windows Store apps, sideload apps by using Microsoft Intune, sideload apps into online and offline images, deeplink apps by using Microsoft Intune, integrate Microsoft account including personalization settings
Support authentication and authorization
Identifying and resolving issues related to the following: Multi-factor authentication including certificates, Microsoft Passport, virtual smart cards, picture passwords, and biometrics; workgroup vs. domain, Homegroup, computer and user authentication including secure channel, account policies, credential caching, and Credential Manager; local account vs. Microsoft account; Workplace Join; Configuring Windows Hello

Plan desktop and device deployment (13%)
Migrate and configure user data
Migrate user profiles; configure folder location; configure profiles including profile version, local, roaming, and mandatory
Configure Hyper-V
Create and configure virtual machines including integration services, create and manage checkpoints, create and configure virtual switches, create and configure virtual disks, move a virtual machine’s storage
Configure mobility options
Configure offline file policies, configure power policies, configure Windows To Go, configure sync options, configure Wi-Fi direct, files, powercfg, Sync Center
Configure security for mobile devices
Configure BitLocker, configure startup key storage

Plan and implement a Microsoft Intune device management solution (11%)
Support mobile devices
Support mobile device policies including security policies, remote access, and remote wipe; support mobile access and data synchronization including Work Folders and Sync Center; support broadband connectivity including broadband tethering and metered networks; support Mobile Device Management by using Microsoft Intune, including Windows Phone, iOS, and Android
Deploy software updates by using Microsoft Intune
Use reports and In-Console Monitoring to identify required updates, approve or decline updates, configure automatic approval settings, configure deadlines for update installations, deploy third-party updates
Manage devices with Microsoft Intune
Provision user accounts, enroll devices, view and manage all managed devices, configure the Microsoft Intune subscriptions, configure the Microsoft Intune connector site system role, manage user and computer groups, configure monitoring and alerts, manage policies, manage remote computers

Configure networking (11%)
Configure IP settings
Configure name resolution, connect to a network, configure network locations
Configure networking settings
Connect to a wireless network, manage preferred wireless networks, configure network adapters, configure location-aware printing
Configure and maintain network security
Configure Windows Firewall, configure Windows Firewall with Advanced Security, configure connection security rules (IPsec), configure authenticated exceptions, configure network discovery

Configure storage (10%)
Support data storage
Identifying and resolving issues related to the following: DFS client including caching settings, storage spaces including capacity and fault tolerance, OneDrive
Support data security
Identifying and resolving issues related to the following: Permissions including share, NTFS, and Dynamic Access Control (DAC); Encrypting File System (EFS) including Data Recovery Agent; access to removable media; BitLocker and BitLocker To Go including Data Recovery Agent and Microsoft BitLocker Administration and Monitoring (MBAM)

Manage data access and protection (11%)
Configure shared resources
Configure shared folder permissions, configure HomeGroup settings, configure libraries, configure shared printers, configure OneDrive
Configure file and folder access
Encrypt files and folders by using EFS, configure NTFS permissions, configure disk quotas, configure file access auditing
Configure authentication and authorization

Manage remote access (10%)
Configure remote connections
Configure remote authentication, configure Remote Desktop settings, configure VPN connections and authentication, enable VPN reconnect, configure broadband tethering
Configure mobility options
Configure offline file policies, configure power policies, configure Windows To Go, configure sync options, configure Wi-Fi direct

Manage apps (11%)
Deploy and manage Azure RemoteApp
Configure RemoteApp and Desktop Connections settings, configure Group Policy Objects (GPOs) for signed packages, subscribe to the Azure RemoteApp and Desktop Connections feeds, export and import Azure RemoteApp configurations, support iOS and Android, configure remote desktop web access for Azure RemoteApp distribution
Support desktop apps
Support considerations including the following: Desktop app compatibility using Application Compatibility Toolkit (ACT) including shims and compatibility database; desktop application co-existence using Hyper-V, Azure RemoteApp, and App-V; installation and configuration of User Experience Virtualization (UE-V); deploy desktop apps by using Microsoft Intune

Manage updates and recovery (10%)

Configure system recovery
Configure a recovery drive, configure system restore, perform a refresh or recycle, perform a driver rollback, configure restore points
Configure file recovery
Restore previous versions of files and folders, configure File History, recover files from OneDrive
Configure and manage updates
Configure update settings, configure Windows Update policies, manage update history, roll back updates, update Windows Store apps



Friday 6 November 2015

What’s behind the odd couple Microsoft-Red Hat partnership

Latest move by Microsoft to support open source technology.

No, hell has not frozen over, but yes Microsoft and Red Hat have announced a major partnership today.

In a collaboration that would have been unthinkable just a few years ago, Microsoft – the purveyor of the mainstream and proprietary Windows OS – has partnered with Red Hat, the champion of an enterprise-class iteration of Linux. And analysts say the move is good for both companies.

What’s actually happening
The meat and potatoes of this relationship is the ability to run Red Hat software – most notably its market-leading Red Hat Enterprise Linux (RHEL) – on Microsoft Azure virtual machines. This adds to Microsoft’s support in recent years of numerous Linux guest operating systems on its cloud, including those from Canonical, SUSE and Oracle.

Initially, Red Hat’s existing customer licenses will be eligible to be used on Azure, and within a couple months Azure customers will have an opportunity to spin up cloud-based versions of RHEL and pay for them as they are used, the companies said.

Amazon Web Services – Microsoft Azure’s biggest competitor in the IaaS market – has actually offered on-demand and bring-your-own-license RHEL options for years.

There’s more to the Microsoft-Red Hat deal though. Both Microsoft Executive Vice President Scott Guthrie and Red Hat Executive Vice President of Products Paul Cormier said that this is one of the deepest partnerships that their companies have signed. Microsoft and Red Hat are organizing a team of engineers from both companies in Redmond (where Microsoft is headquartered) that will provide joint support to common customers. “There’ll be no finger pointing,” Cormier said.

No other partner has joint-engineering operations co-located on the Microsoft campus, Guthrie said.

There are a number of other, smaller parts of this deal, too: Red Hat’s distribution of OpenStack and OpenShift – the company’s IaaS and PaaS platforms – will now support the Windows OS, .NET apps and Windows containers.

“All existing Red Hat development tools and Red Hat container technology can now run on Microsoft Azure,” Guthrie said. Red Hat’s CloudForms management platform, which basically controls virtual and private cloud environments, will eventually administer Azure resources. The new Red Hat on Azure services will be launched in coming weeks and months.

Building trust

“In historical terms this is a monumental announcement,” wrote Al Hilwa, IDC’s software development research director. His colleague, Al Gillen, said this move likely would not have been possible under Steve Ballmer’s reign at Microsoft.

Guthrie and Cormier, the two executives who led the partnership, said it required building up trust between the companies. “We’ve had a long history of competition and maybe there wasn’t much trust there,” Cormier said. “We decided to trust and give it a chance.”

Guthrie says the partnership should be viewed through the broader lens of moves Microsoft has made: Microsoft has worked to support Office 365 on Android and iOS, and it now supports the major Linux distros on Azure (he says one-quarter of all VMs on Azure are Linux).

“I don’t view today as a complete outlier in terms of the approach or philosophy we’re trying to take,” Guthrie said on a press conference call. “But rather it’s very consistent with the openness and customer centricity that in particular Satya [Nadella] as our CEO has driven. That has really grounded our principles.”

What it means
Analysts say the move sets Microsoft up to better compete in the IaaS cloud market. “The new Microsoft has taken bold new steps and has been on a path to partner with its fiercest rivals of past years,” Hilwa wrote. “Strategically, this is what is required to be a player at scale in the cloud platform wars.”
"It’s a big win for both companies but a bigger win for Red Hat."

Red Hat customers seemed to embrace the news too. “I think it’s a big win for both companies but a bigger win for Red Hat since Microsoft is now ‘all in’ with their distribution and technologies,” says Nicholas Gerasimatos, director of cloud services engineering at FICO, a big Red Hat user.

Many organizations use Microsoft SaaS tools like Office 365 and SharePoint and use RHEL for custom business applications or in their data center. “Microsoft and Red Hat's decision to collaborate will allow their common customers to target Azure as a preferred public cloud,” says Charles King of PundIT.

Maybe that will be enough to give some customers reason to stay with Microsoft when it comes to public cloud instead of jumping to AWS.




Saturday 31 October 2015

2015 technology industry graveyard

Cisco, Microsoft, Google and others bury outdated technologies to move ahead with new ones.

Ba-bye
The Technology Industry Graveyard is pretty darn full in 2015, and we’re not even including the near-dead such as RadioShack and Microsoft’s IE browser. Pay your respects here…

GrooveShark
The self-described “World’s Music Library” is no more after shutting down in April in the wake of serious legal pressure from music companies whose songs GrooveShark allowed to be shared but had never licensed. Apple and Google had each kicked GrooveShark out of their app stores years ago due to complaints from music labels. Far sadder than the 9-year-old company’s demise, however, was the death of co-founder Josh Greenberg in July at the age of just 28.

Typo iPhone keyboard
Not even the glamor of being co-founded by American Idol host Ryan Seacrest could help Typo Innovations save its iPhone keyboard, which BlackBerry said infringed on its patents. So instead, Typo bailed on the iPhone model and settled for selling keyboards for devices with screens of 7.9 inches or larger (like iPads).

Amazon Fire Phone
With a product name like Fire, you’re just asking for colorful headlines if it bombs. And indeed, Amazon stopped making its Fire Phone about a year after introducing it, and media outlets were quick to highlight the company “extinguishing” it or to remark on the phone being “burnt out.” Amazon has had some success on the hardware front, namely with its Kindle line, but the Fire just didn’t distinguish itself and was going for free with a carrier contract by the end.

Interop New York
Interop Las Vegas carries on as one of the network industry’s top trade shows next May, but little sibling Interop New York is no more this year. The fall show, held at the Javits Center since 2005, was always smaller and was discontinued for 2015 despite lively marketing material last year touting “More Than 30 Interop New York Exhibitors and Sponsors to Make Announcements in Anticipation of the Event.”

GTalk
Google ditched so many things in 2015 that we devoted an entire slideshow to Google’s Graveyard. So to choose just one representative item here, we remember Google Talk, which had a good run, starting up in 2005. But it’s never good when Google pulls out the term “deprecated” as it did in February in reference to this chat service’s Windows App. Google said it was pulling the plug on GTalk in part to focus on Google Hangouts in a world where people have plenty of other ways to chat online. However, Google Talk does live on via third-party apps.

Cisco Invicta storage products
Cisco has a good touch when it comes to acquisitions, but its $415 million WHIPTAIL buyout from 2013 didn’t work out. The company in July revealed it had pulled the plug on its Invicta flash storage appliances acquired via that deal. It’s not unthinkable though that Cisco could go after another storage company, especially in light of the Dell-EMC union.

RapidShare
The once-popular file hosting system, begun in 2002, couldn’t withstand the onslaught of competition from all sides, including Google and Dropbox. Back in 2009, the Switzerland-based operation ran one of the Internet’s 20 most visited websites, according to Wikipedia. It shut down on March 31, and users’ leftover files went away with it.

Windows RT devices
This locked-down Microsoft OS for tablets and convertible laptops fared about as well as Windows 8, after being introduced as a prototype in 2011 at the big CES event in Las Vegas. Microsoft’s software for the 32-bit ARM architecture was intended to enable devices to exploit that architecture’s power efficiency, but overall, the offering proved to be a funky fit with existing Windows software. Production of RT devices stopped earlier in 2015 as Microsoft focuses on Windows 10 and more professional-focused Surface devices.

OpenStack vendor Nebula
As Network World’s Brandon Butler wrote in April, Nebula became one of the first casualties of the open source OpenStack cloud computing movement when it shuttered its doors. The company, whose founder was CIO for IT at NASA before starting Nebula in 2011, suggested in its farewell letter that it was a bit ahead of its time, unable to convert its $38 million in funding and hardware/software appliances into a sustainable business.

FriendFeed
Facebook bought this social news and information feed aggregator in 2009, two years after the smaller business started, and then killed it off in April. People have moved on to other means of gathering and discovering info online, so FriendFeed died from lack of use. It did inspire the very singular website, Is FriendFeed Dead Yet, however, so its legacy lives on.

Apple Aperture
Apple put the final nail in its Aperture photo editing app in 2015, ending the professional-quality post-production app’s 10-year run at Version 3.6. In its place, Apple introduced its Photos app for users of both its OS X Mac and iOS devices.

Secret
One of the co-founders of the anonymous sharing app shared this in April: The company was shutting down and returning whatever part of its $35 million in funding was left. The company’s reality was just not going to meet up with his vision for it, said co-founder David Byttow. The company faced criticism that it, like other anonymous apps such as Yik Yak, allowed for cyberbullying.

Amazon Wallet
Amazon started the year by announcing that its Wallet app, the company’s 6-month-old attempt to get into mobile payments, was a bust. The app, which had been in beta, allowed users to store their gift/loyalty/rewards cards, but not debit or credit cards as they can with Apple and Google mobile payment services.

Circa News app
Expired apps could easily fill an entire tech graveyard, so we won’t document all of their deaths here. But among them not making it through 2015 was Circa, which reportedly garnered some $4 million in venture funding since starting in 2012 but didn’t get enough takers for its app-y brand of journalism.


Tuesday 27 October 2015

Aruba succeeded where other Wi-Fi companies failed: A talk with the founder about the acquisition by HP, the future of Wi-Fi

Wireless LAN stalwart Aruba was acquired by HP last March for $3 billion, so Network World Editor in Chief John Dix visited Aruba co-founder Keerti Melkote to see how the integration is going and to get his insights on the evolution of Wi-Fi. Melkote has seen it all, growing Aruba from a startup in 2002 to the largest independent Wi-Fi company with 1,800 employees. After Aruba was pulled into HP, he was named CTO of the combined network business, which employs roughly 5,000. In this far-ranging interview Melkote talks about product integration and rationalization, the promise of location services and IoT, the competition, the arrival of gigabit Wi-Fi and what comes next.

Why sell to HP?
Aruba was doing really well as a company. We gained market share through every technology transition -- from 802.11a to “b” to “g” to “n” and now “ac” -- and today we’re sitting at roughly 15% global share and have a lot more than that in segments like higher education and the federal market. But we were at a point where we could win more if we had an audience at the CIO level, and increasingly we were getting exposed to global projects that required us to have a large partner in tow to give us the people onsite to execute on a worldwide basis.

So we began looking for what internally we called a big brother to help us scale to that next level. We talked to the usual suspects in terms of professional services, consulting companies, etc., but then HP approached us and said they were interested in partnering with us to go after the campus market, which is changing from wired to wireless.

HP has a good history on the wired side, so we felt this was an opportune moment to bring the sides together, but go to market with a mobile-first story. After all, as customers re-architect their infrastructure they’re not going with four cable drops to every desk, they’re looking at where the traffic is, which is all on the wireless networks these days. HP agreed with that and basically said, “Why don’t you guys come in and not only grow Aruba, but take all of networking within HP and make it a part of the whole ecosystem.”

So HP Networking and Aruba have come together in one organization and Dominic Orr [formerly CEO of Aruba] is the leader for that and I am Chief Technology Officer. We are focusing on integrating the Aruba products with the HP network products to create a mobile-first campus architecture.

Does the Aruba name go away and does everyone move to an HP campus?
No, and there is some exciting news there. The go-forward branding for networking products in the campus is going to be Aruba, including the wire line products. Over time you will start to see a shift in this mobile-first architecture with Aruba switching also coming to market.

Will that include the HP Networking operations in the area?
No, we have a global development model, so we have development sites here in Sunnyvale, Palo Alto and Roseville. And we have sites in India, China, Canada and in Costa Rica. There won’t be any changes to any of the development sites. As the business grows we’re going to have to grow most of those sites.

HP has bought other wireless players along the way, including Colubris and 3Com, so how does it all fit together?
Colubris was a pretty focused wireless acquisition back in 2008 and those products have done well for HP, but that customer base is ready for upgrades to 11ac and as they upgrade they will migrate to Aruba. The former product line will be end-of-lifed over time, but we’re not going to end support for it. There is a small team supporting it and will continue to do so until customers are ready to migrate.

3Com was a much broader acquisition, involving data center campus products, routing, etc. Most of the R&D for 3Com is in China with H3C [the joint venture 3Com formed with Huawei Technologies before 3Com was acquired by HP in 2010]. There is a two-prong go-to-market approach for those products. There is a China go-to-market, which has done really well. In fact, they are number one, even ahead of Cisco, from an overall network market share perspective in China. For the rest of the world we were using the products to go after the enterprise.

As you probably heard recently, we are going to sell 51% of our share in H3C to a Chinese owned entity because there needs to be Chinese ownership for them to further grow share. H3C will be an independent entity on the Chinese stock market and will sell networking gear in China and HP servers and storage as well.

So that becomes our way to attack the China market while we will continue to sell the other network products to the rest of the world. Those products are doing very well, especially in the data center. They run some of the largest data centers in the world, names that are less familiar here in the U.S., but very large data centers for the likes of Alibaba, Tencent and other companies that are basically the Amazons and Facebooks of China.

3Com has a wireless portfolio called Unified Wireless. That product line will also be end-of-lifed but still supported, and as we migrate to next-generation architectures we will position Aruba for those buyers. The definitive statement we’ve made is Aruba will be the wireless LAN and mobility portfolio in general and Hewlett-Packard’s network products will be the go-forward switching products.

Two products that are really helping to integrate our product lines are: ClearPass, which is our unified policy management platform, which is going to be the first point where access management is integrated between wired and wireless; and AirWave, which is the network management product which will become the single console for the customer to manage the entire campus network. For the data center we will have a different strategy because data center management is about integrating with servers and storage and everything else, but for the campus the AirWave product will be the management product.

3Com has a product called IMC Intelligent Management Console that will continue if customers need deep wired management, but if you need to manage a mobile-first campus, AirWave will do the complete job for you.

Given your longevity and perspective in the wireless LAN business, are we where you thought we would be in terms of Wi-Fi usage when you first started on this path 13 years ago?
It’s taken longer than I thought it would, but it has certainly far surpassed my expectations. Back in 2002 there was no iPhone or iPad. Wireless was for mobile users on laptops and we believed it would become the primary means of connecting to the network and you would no longer need to cable them in. That was the basic bet we made when we started Aruba. My hope was we would get there in five to seven years and it took 15, but things always take a little bit longer than you think.

The seminal moment in our business was the introduction of the iPad. Even though the iPhone was around, most people were still connecting to the cellular network rather than Wi-Fi because of the convenience. Laptop-centric networking was still prominent, but when the iPad arrived there was no way to connect it to the wire, and there were all sorts of challenges. How do you provide pervasive wireless connectivity when the executives who brought them in were taking them along wherever they went? Security was a big challenge because they were all personal devices.

We had developed and perfected answers for those questions over the years, so it was all sort of right there for us. And the last five years have seen dramatic changes in terms of all-wireless offices, open office space architectures, etc. Microsoft Lync was a big inflection point as well.

Why is that?
Whenever I talk to customers about pulling the cable out they always point to the phone and say, “I still need to pull a cable for that, which means I need power over Ethernet, I need an Ethernet switch in the closet, I need a PBX.” But when Lync was introduced in 2013 you could get your unified communications on your smart phone. Today, if you were to ask what is the most important device on the network, I’d say it’s the smart phone because it’s converging the computing and messaging and everything else on one device. Now you can provide a rich experience on a mobile device and do it anywhere, anytime.

Where do we stand on location-based services?
We’ve been talking about location services for a very long time. What happened was Wi-Fi based location alone wasn’t actually solving the problem. It was giving you a sense of where people were in a facility, but getting the technology to allow you to engage with somebody in physical space was not working, mostly because the operating systems on those mobile devices weren’t supporting Wi-Fi for location, just connectivity.

We have now integrated Bluetooth Low Energy (BLE) into our portfolio so you have two ways of connecting with the user; the Wi-Fi side gives you presence and Bluetooth Low Energy gives you the ability to engage on the user side so you can send notifications about where they are. That technology lets us provide tools for marketers, for retailers to send coupons, invite people into a store, and so on.
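For readers curious how beacon-based proximity works in general terms: an app typically turns a beacon’s received signal strength into a rough distance using a log-distance path-loss model. The sketch below is that generic textbook model with assumed calibration values — it is not Aruba’s location engine.

```python
def estimate_distance_m(rssi_dbm: float,
                        rssi_at_1m_dbm: float = -59.0,
                        path_loss_exponent: float = 2.0) -> float:
    """Rough beacon distance from RSSI via the log-distance path-loss model.

    rssi_at_1m_dbm is the beacon's calibrated signal at one metre (an assumed
    typical value); the exponent is ~2 in free space, higher indoors.
    """
    return 10 ** ((rssi_at_1m_dbm - rssi_dbm) / (10 * path_loss_exponent))

# A reading of -75 dBm suggests the phone is roughly six metres from the beacon.
print(round(estimate_distance_m(-75.0), 1))
```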

So it is finally picking up some?
It is. Actually Asia is doing well. There is a lot of construction in Asia and this is one of the demands. But the U.S. is picking up. We just implemented a large network at Levi’s Stadium right down the street here [which recently replaced Candlestick Park as home of the San Francisco 49ers].

One of the things the CEO imagined was that, as you drive from home to the game, their app would guide your experience. So they’ll take you to the right parking lot, then provide you directions to your seat, and once you are in the seat enjoying the game they wanted to provide amenities -- so food and beverage ordering and the ability to watch instant replays and the like. All these things are available for a fee of course. In the first season of operation this app generated $2 million of additional sales for Levi’s Stadium.

That was a big win for us, not just for demonstrating high-density Wi-Fi, where we have regularly seen 3-4 gig of traffic going to the Internet, but also for showing the revenue-generating potential of location-based technology.

Speaking of networking a lot of things, what do you make of the Internet of Things movement?
Eventually where it all goes is integrating the Internet of Things. Every day I interact with customers there are new use cases coming up around the intersection of location-based technology and the Internet of Things. And that’s squarely in the purview of what we are doing. It’s not today. Today is still about this all-wireless workplace, but in the next five years I think you’ll see a lot more of this. There is a lot of innovation still to come.

There’s a hodgepodge of stuff used to connect sensors today, but you see Wi-Fi playing a prominent role?
Wi-Fi will definitely be an integral component, but Bluetooth Low Energy will also be important because some sensors will be battery operated. There may be a role for the evolution of ZigBee as well. That’s super low energy. ZigBee is not yet in the mainstream enterprise but I can see some successor of that happening. But sensors will look to wireless for connectivity because they need to go anywhere. You can’t have cable follow them. So the wireless fabric is becoming super-critical for that.

Switching gears a bit, how is competition changing?
We look at three key market segments: large and medium enterprises; small/medium businesses, which have completely different characteristics; and service providers. Aruba has done really well in the large and medium enterprise segment. We have done reasonably well in the small/medium segment, but there is more competition there. Ruckus has done well there. And service provider is the emerging battleground.

As a standalone company Aruba couldn’t afford to invest, frankly, in all three segments. We were focused on the large and medium enterprise and we built a good franchise. Clearly Cisco is the primary competitor there, but now as part of HP we have another go-to-market capability and investment to take on all three segments in a meaningful way, so that’s another big reason why we came together.

We just recently announced a partnership with Ericsson to go after the service provider Wi-Fi segment, and that will help us gain share. And HP has been a strong player in the small/medium business so we’re going to take Aruba down-market. We’re going to play in all three segments. I feel if we just keep executing, market share gains are possible.

Ruckus talks about optimizing the airwaves as being their key differentiator. How do you differentiate Aruba?
The four key things I talk about are the emergence of the all-wireless workplace, in-flight communications and voice, the need for deep security with bring-your-own devices, and the need for location-based services trending toward IoT.

We talked about the all-wireless workplace and location services. Regarding voice traffic, we have invested quite a bit of energy ensuring optimal utilization. Ruckus focused on the antenna technology, while we are focused on the software that goes on top of the antenna. The analogy I’ll give you is, as you walk away from an access point I can boost my antenna power to give you a better signal, and that problem is a good problem to solve if you’re in a home because you only have one access point. But in the enterprise there is a collection of access points and the problem isn’t about holding onto a client for as long as possible, but to move the client to the best access point. So the trick is to enable the client to roam from one access point to another in a very efficient way. We call this technology ClientMatch. That is the core differentiator for us over the air, and we’ve specifically optimized it for voice by working with the Microsoft team to enable Lync and Skype for Business.
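To make the steering idea concrete, here is a toy Python sketch of “move the client to the best access point”: prefer an AP that hears the client well and isn’t overloaded. It is only an illustration of the concept Melkote describes; Aruba’s actual ClientMatch logic is proprietary and considers far more than this.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class APReading:
    ap_name: str
    rssi_dbm: float     # how strongly this AP hears the client
    client_count: int   # how many clients the AP is already serving

def pick_target_ap(readings: List[APReading],
                   min_rssi_dbm: float = -70.0) -> Optional[APReading]:
    """Toy steering rule: among APs that hear the client acceptably, prefer
    the strongest signal and break ties toward the least-loaded AP."""
    candidates = [r for r in readings if r.rssi_dbm >= min_rssi_dbm]
    if not candidates:
        return None
    return max(candidates, key=lambda r: (r.rssi_dbm, -r.client_count))

print(pick_target_ap([APReading("ap-lobby", -72.0, 4),
                      APReading("ap-east", -55.0, 12),
                      APReading("ap-west", -58.0, 3)]))
```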

Security is a place we cannot be touched. We’ve had deep security expertise for a very long time. The DoD, three of the armed forces, most of the federal market actually, uses Aruba. I can’t get into all the details, but we have significant penetration because of our security depth. For enterprises that is a big deal. They really want to make sure the security side is well covered.

What’s the hot button in wireless security today?
We know how to encrypt. We know how to authenticate. Basically it is the threat of an unmanaged device coming into the network. We’re looking at solving that problem as a mobile security problem and we solved one part of it with access management, but we have this Adaptive Trust architecture which integrates with mobile device management tools -- VMware’s AirWatch, MobileIron, Microsoft’s Intune. We partner with those companies and the likes of Palo Alto Networks, and HP now brings its security and management platform ArcSight to the table. The idea is to secure the mobile edge so no matter where you are you have a secure connection back to the enterprise.

Let’s shift to the adoption of Gigabit Wi-Fi, or 802.11ac. How is that transition going?
The campus access network from your desktop to the closet has stagnated for a long time. That’s because there was really nothing driving the need for more than a gigabit’s worth of bandwidth to the desktop. Now with Gigabit Wi-Fi technologies the over the air rates are greater than if you were to connect to the wired LAN. So if you deploy Gigabit Wi-Fi and have signals going at 2G, let’s say, the wired line becomes a bottleneck. There is a technology called Smart Rate that HP Networking introduced for its switches which allows you to raise the data rates to 2.5Gbps and even 5Gbps. At that point your access points don’t have to contend with the bottleneck and can pick up the bits over the air and put them on the wire without dropping them.

So you will need wired ports faster than a gigabit as you transition to this mobile workplace, but you won’t need as many ports as before. That is a transition, I think, that will happen over the next 2-3 years.
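A quick back-of-the-envelope calculation (standard 802.11ac figures, not vendor measurements) shows why a Wave 2 radio can outrun a single gigabit uplink, and hence why 2.5/5Gbps ports such as Smart Rate matter:

```python
# Back-of-the-envelope peak PHY rate for an 802.11ac Wave 2 access point
# (80 MHz channel, 256-QAM with 5/6 coding, short guard interval, 4 spatial
# streams). These are standard textbook figures, not vendor measurements.
data_subcarriers = 234        # usable data subcarriers in an 80 MHz channel
bits_per_subcarrier = 8       # 256-QAM
coding_rate = 5 / 6
spatial_streams = 4
symbol_time_s = 3.6e-6        # OFDM symbol time with short guard interval

phy_rate_bps = (data_subcarriers * bits_per_subcarrier * coding_rate
                * spatial_streams / symbol_time_s)

print(f"Peak PHY rate: {phy_rate_bps / 1e6:.0f} Mb/s")    # ~1733 Mb/s
print("Exceeds a 1 GbE uplink:", phy_rate_bps > 1e9)      # True - hence 2.5/5Gbps ports
```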

Did many people buy into Wave 1 of Gigabit Wi-Fi or did they hold off?
We’ve had tremendous success with Wave 1. The need for bandwidth is truly insatiable. And there is a ton of demand still yet to be put on the network. Video is a significant driver of bandwidth and most companies are throttling video. So the more you open the pipe, the more capacity I think people will consume. Wave 1 has done very well. I think Wave 2 will continue to do well, and then there’s .11ax, which will take capacity even higher.

So people bought into Wave 1 even though Wave 2 requires them to replace hardware?

I tell customers, if you’re going to wait for the next best thing you’re going to wait forever, because there’s always going to be the next best thing on the horizon. So it’s really a question of where you are in your lifecycle for an investment. If the customer is at a point where they’ve had five years of investment and they’re hurting, it’s a good time. Wave 1 can actually solve a lot of problems. There’s no need to wait another 18 months for Wave 2 technology. You know you’re going to refresh that too in five years and there will be new technology at that point in time.

Will anybody buy anything but Wave 2 at this point?
It depends. Wave 1 technology you can buy today at multiple price points in the industry. Wave 2 is still at the very top end of the range. So if you’re looking for, let’s say, lighting up a retail store and you don’t need all the capacity of Wave 2, then Wave 1 will do just fine. That’s typical of most technologies, to start at the top and eventually work its way down. We’re right in the beginning of the Wave 2 transition.

How about in carpeted office space? Would you just drop Wave 2 into key points to satisfy demand?
Wi-Fi has always basically been single user. Only one user could speak on a wireless LAN at a time. With Wave 2 you can have multiple conversations at the same time; each access point can serve four streams. So that boosts capacity in a significant way and can also improve spectrum efficiency. For that reason alone, I think Wave 2 should be used pretty much anywhere you go. You could start with a high density zone and then work your way up. That’s typically how people do it, but I would encourage most customers to take advantage of this technology.

In the industry we’ve always used speed as a measure of the next generation of technology. Never have we given attention to efficiency. This is the first time where we’re saying efficiency gains are pretty significant.

And Wave 2 ultimately will be able to support up to eight streams, right?
Yes, the technology allows you to do eight streams, although it is not possible to pack eight antennas into the form factor at this point. But it will come.

I think the targets are up to 10 gig. Let’s see how far they get. At that point, the Gigabit Ethernet backhaul will become an even more interesting problem. You’ll need 10 gig of backhaul from the access point.

In terms of the coming year, what should people look for?
They should expect a streamlined roadmap with unified management for wired and wireless, and unified security for wired and wireless in the campus. And they should expect changes in wiring closet switches to support Wave 2.

The other piece cooking in the labs is the next-generation controller technology. We invented the controller back in 2002 and that has gone through multiple generations of upgrades. The first controller had something like a 2Gig backplane that could support 1,000 users, and now we have a 40G controller that supports 32,000 users. So how do you get from there to 500G? That will require us to rethink the architecture because these campuses are getting there.

We used to talk about tens of thousands of devices on a campus. Today campuses have hundreds of thousands of devices. How do you support them in a single architecture? Right now you add more controllers, but that creates a management problem. We are working on a unified solution for very large campuses and taking it to the next level for service providers as well.

Tuesday 13 October 2015

70-332 Advanced Solutions of Microsoft SharePoint Server 2013

QUESTION 01
You need to ensure that the developers have the necessary permissions to meet the BCS model
requirements. What should you do?

A. Grant Edit permissions to the developers by using the Set Object Permissions option.
B. Grant Execute permissions to the developers by using the Set Object Permissions option.
C. Grant Edit permissions to the developers by using the Set Metadata Store Permissions option.
D. Grant Execute permissions to the developers by using the Set Metadata Store Permissions
option.

Correct Answer: C

QUESTION 02
You need to configure Excel Services. What should you do?

A. Add a trusted file location to the Certkingdom360 site.
B. Add each user as a Viewer.
C. Add each user as a Contributor.
D. Add a trusted data connection library to the Certkingdom360 site.

Correct Answer: A

QUESTION 56
You need to configure the BCS model to access data. What should you do?

A. Create an external content type and enter the target application friendly name in the Secure
Store Application ID field
B. Create an external content type and enter the target application ID in the Secure Store
Application ID field.
C. Create an external content type and choose the Connect with impersonated custom identity
option. Enter the target application friendly name of the Secure Store target application.
D. Create an external content type and choose the Connect with user's identity option.

Correct Answer: B

QUESTION 03
You need to meet the site availability requirements. What should you do?

A. Configure each web server as a node of a Network Load Balancing (NLB) cluster.
B. Create an alternate access mapping entry for each server
C. Create client-side host entries to point to specific servers.
D. Create Request Management rules to route traffic to each server.

Correct Answer: A

Friday 9 October 2015

70-247 Configuring and Deploying a Private Cloud with System Center 2012

QUESTION 1
You have a System Center 2012 Virtual Machine Manager (VMM) infrastructure that contains a
server named Server1. Server1 hosts the VMM library. You add a server named Server2 to the
network. You install the Windows Deployment Services (WDS) server role on Server2. You have the
Install.wim file from the Windows Server 2008 R2 Service Pack 1 (SP1) installation media. You need
to install Hyper-v hosts by using the bare-metal installation method. What should you do first?

A. Add Install.wim to the VMM library.
B. Convert Install.wim to a .vhd file.
C. Convert Install.wim to a .vmc file.
D. Add Install.wim to the Install Images container.

Answer: B


QUESTION 2
You have a System Center 2012 Virtual Machine Manager (VMM) infrastructure that contains a
virtualization host named Server2. Server2 runs Windows Server 2008 R2 Service Pack 1 (SP1).
Server2 has the Hyper-V server role installed. You plan to deploy a service named Service1 to
Server2. Service1 has multiple load-balanced tiers. You need to recommend a technology that must
be implemented on Server2 before you deploy Service1. What should you recommend?

A. MAC address spoofing
B. the Network Policy and Access Services (NPAS) server role
C. TCP Offloading
D. the Multipath I/O (MPIO) feature

Answer: A


QUESTION 3
Your network contains a server named Server1 that has System Center 2012 Virtual Machine
Manager (VMM) installed. You have a host group named HG1. HG1 contains four virtualization
hosts named Server2, Server3, Server4, and Server5. You plan to provide users with the ability to
deploy virtual machines by using the Self-Service Portal. The corporate management policy states
that only the members of a group named Group1 can place virtual machines on Server2 and
Server3 and only the members of a group named Group2 can place virtual machines on Server4
and Server5. You need to recommend a cloud configuration to meet the requirements of the
management policy. What should you recommend?

A. Create two clouds named Cloud1 and Cloud2. Configure the custom properties of each cloud.
B. Create a host group named HG1\HG2. Create one cloud for HG1 and one cloud for HG2. Move
two servers to HG2.
C. Create two clouds named Cloud1 and Cloud2. Configure placement rules for HG1.
D. Create two host groups named HG1\Group1 and HG1\Group2. Create one cloud for each new
host group. Move two servers to each host group.

Answer: D


QUESTION 4
Your company has a private cloud that contains 200 virtual machines. The network contains a
server named Server1 that has the Microsoft Server Application Virtualization (Server App-V)
Sequencer installed. You plan to sequence, and then deploy a line-of-business web application
named App1. App1 has a Windows Installer package named Install.msi. App1 must be able to store
temporary files. You need to identify which task must be performed on Server1 before you deploy
App1. What task should you identify?

A. Modify the environment variables.
B. Add a script to the OSD file.
C. Compress Install.msi.
D. Install the Web Server (IIS) server role.

Answer: D


QUESTION 174
Your company has three datacenters located in New York, Los Angeles and Paris. You deploy a
System Center 2012 Virtual Machine Manager (VMM) infrastructure. The VMM infrastructure
contains 2,000 virtual machines deployed on 200 Hyper-V hosts. The network contains a server
named DPM1 that has System Center 2012 Data Protection Manager (DPM) installed.
You need to recommend a solution for the infrastructure to meet the following requirements:
* Automatically back up and restore virtual machines by using workflows.
* Automatically back up and restore system states by using workflows.
What should you include in the recommendation? (Each correct answer presents part of the
solution. Choose two.)

A. Deploy System Center 2012 Orchestrator.
B. Install the Integration Pack for System Center Virtual Machine Manager (VMM).
C. Install the Integration Pack for System Center Data Protection Manager (DPM).
D. Deploy System Center 2012 Operations Manager.
E. Deploy System Center 2012 Service Manager.

Answer: AB


QUESTION 5
You are the datacenter administrator for a company named CertKingdom, Ltd. The network contains a
server that has System Center 2012 Virtual Machine Manager (VMM) installed. You create four
private clouds. Developers at CertKingdom have two Windows Azure subscriptions. CertKingdom creates a
partnership with another company named A.Datum. The A.Datum network contains a System
Center 2012 Virtual Machine Manager (VMM) infrastructure that contains three clouds.
Developers at A.Datum have two Windows Azure subscriptions. You deploy System Center 2012
App Controller at A.Datum. You plan to manage the clouds and the Windows Azure subscriptions
for both companies from the App Controller portal. You need to identify the minimum number of
subscriptions and the minimum number of connections required for the planned management. How
many connections and subscriptions should you identify?

A. Two connections and four subscriptions
B. Two connections and two subscriptions
C. Four connections and four subscriptions
D. Eight connections and four subscriptions
E. Four connections and two subscriptions

Answer: A


QUESTION 6
Your network contains an Active Directory forest named CertKingdom.com. The forest contains a System
Center 2012 Operations Manager infrastructure. Your company, named CertKingdom, Ltd., has a partner
company named A. Datum Corporation. The A. Datum network contains an Active Directory forest
named adatum.com. Adatum.com does not have any trusts. A firewall exists between the A. Datum
network and the CertKingdom network. You configure conditional forwarding on all of the DNS servers
to resolve names across the forests. You plan to configure Operations Manager to monitor client
computers in both of the forests. You need to recommend changes to the infrastructure to monitor
the client computers in both of the forests. What should you include in the recommendation? (Each
correct answer presents part of the solution. Choose two.)

A. Allow TCP port 5723 on the firewall.
B. Deploy a gateway server to adatum.com.
C. Create a DNS zone replica of adatum.com.
D. Allow TCP port 5986 on the firewall.
E. Create a DNS zone replica of CertKingdom.com.
F. Deploy a gateway server to CertKingdom.com.

Answer: AB

Thursday 17 September 2015

More automation, fewer jobs ahead

Internet of Things in 2025: The good and bad

Within 10 years, the U.S. will see the first robotic pharmacist. Driverless cars will equal 10% of all cars on the road, and the first implantable mobile phone will be available commercially.

These predictions, and many others, were included in a World Economic Forum report, released this month. The "Technological Tipping Points Survey" is based on responses from 800 IT executives and other experts.

A tipping point is the moment when specific technological shifts go mainstream. In 10 years, many technologies will be widely used that today are in pilot or are still new to the market.

The Internet of Things will have a major role. Over the next decade there will be one trillion sensors allowing all types of devices to connect to the Internet.

Worldwide, the report estimates, 50 billion devices will be connected to the Internet by 2020. To put that figure in perspective, the report points out, the Milky Way -- the earth's galaxy -- contains about 200 billion suns.

The ubiquitous deployment of sensors, via the Internet of Things, will deliver many benefits, including increases in efficiency and productivity and improved quality of life. But its negative impacts include job losses, particularly for unskilled labor, as well as greater complexity and loss of control.

Robotics, too, will be a mixed bag. It will return some manufacturing back to the U.S., as offshore workers are replaced with onshore robots. But robotics -- including the first robotic pharmacist -- will result in job losses as well.

There's concern that "we are facing a permanent reduction in the need for human labor," said the report.

That may still be an outlier view. Efficiency and productivity gains have historically increased employment. But a shift may be underway.

"Science fiction has long imagined the future where people no longer have to work and could spend their time on more noble pursuits," the report said. "Could it be that society is reaching that inflection point in history?"

That question doesn't have a clear answer. The Industrial Revolution destroyed some jobs but created many more, the report points out. "It can be challenging to predict what kinds of jobs will be created, and almost impossible to measure them," the report notes.

Other predictions included:
Driverless cars will make up one in 10 of the vehicles on the road, and this will improve safety, reduce stress, free up time and give older and disabled people more transportation options. But driverless vehicles may also result in job losses, particularly in the taxi and trucking industries.

One in 10 people will be wearing connected clothing in 10 years. Implantable technologies will also be more common, and may be as sophisticated as smartphones. These technologies may help people self-manage healthcare as well as lead to a decrease in missing children. Potential negatives include loss of privacy and surveillance issues.

The forecasters were bullish on vision technologies over the next decade. This is tech similar to Google Glass that enhances, augments and provides "immersive reality." Eye-tracking technologies, as well, will be used as a means of interaction.

Unlimited free storage that's supported by advertising is expected by 2018.


Saturday 5 September 2015

Microsoft, U.S. face off again over emails stored in Ireland

The company has refused to turn over to the government the emails stored in Ireland

A dispute between Microsoft and the U.S. government over turning over emails stored in a data center in Ireland comes up for oral arguments in an appeals court in New York on Wednesday.

Microsoft holds that an outcome against it could affect the trust of its cloud customers abroad as well as relationships between the U.S. and other governments, which have their own data protection and privacy laws.

Customers outside the U.S. would be concerned about extra-territorial access to their user information, the company has said. A decision against Microsoft could also establish a norm that would allow foreign governments to reach into the U.S.-based computers of companies over which they assert jurisdiction and seize the private correspondence of U.S. citizens.

The U.S. government has a warrant for access to emails, held by Microsoft, of a person involved in an investigation, but the company holds that nowhere did the U.S. Congress say that the Electronic Communications Privacy Act "should reach private emails stored on providers’ computers in foreign countries."

It prefers that the government use "mutual legal assistance" treaties it has in place with other countries including Ireland. In an amicus curiae (friend of the court) brief filed in December in the U.S. Court of Appeals for the Second Circuit, Ireland said it “would be pleased to consider, as expeditiously as possible, a request under the treaty, should one be made.”

A number of technology companies, civil rights groups and computer scientists have filed briefs supporting Microsoft.

In a recent filing in the Second Circuit court, Microsoft said "Congress can and should grapple with the question whether, and when, law enforcement should be able to compel providers like Microsoft to help it seize customer emails stored in foreign countries."

"We hope the U.S. government will work with Congress and with other governments to reform the laws, rather than simply seek to reinterpret them, which risks happening in this case," Microsoft's general counsel Brad Smith wrote in a post in April.

Lower courts have disagreed with Microsoft's point of view. In April last year, U.S. Magistrate Judge James C. Francis IV of the U.S. District Court for the Southern District of New York refused to quash a warrant that authorized the search and seizure of information linked to a specific Web-based email account stored on Microsoft's premises.

Microsoft complied with the search warrant by providing non-content information held on its U.S. servers but filed a motion to quash the warrant after it concluded that the account was hosted in Dublin and the content was also stored there.

If the territorial restrictions on conventional warrants applied to warrants issued under section 2703 (a) of the Stored Communications Act, a part of the ECPA, the burden on the government would be substantial, and law enforcement efforts would be seriously impeded, the magistrate judge wrote in his order. The act covers required disclosure of wire or electronic communications in electronic storage.

While the company held that courts in the U.S. are not authorized to issue warrants for extraterritorial search and seizure, Judge Francis held that a warrant under the Stored Communications Act, was "a hybrid: part search warrant and part subpoena." It is executed like a subpoena in that it is served on the Internet service provider who is required to provide the information from its servers wherever located, and does not involve government officials entering the premises, he noted.

Judge Loretta Preska of the District Court for the Southern District of New York rejected Microsoft's appeal of the ruling, and the company thereafter appealed to the Second Circuit.




Monday 31 August 2015

10 security technologies destined for the dustbin

Systemic flaws and a rapidly shifting threatscape spell doom for many of today’s trusted security technologies

Perhaps nothing, not even the weather, changes as fast as computer technology. With that brisk pace of progress comes a grave responsibility: securing it.

Every wave of new tech, no matter how small or esoteric, brings with it new threats. The security community slaves to keep up and, all things considered, does a pretty good job against hackers, who shift technologies and methodologies rapidly, leaving last year’s well-recognized attacks to the dustbin.

Have you had to enable the write-protect notch on your floppy disk lately to prevent boot viruses or malicious overwriting? Have you had to turn off your modem to prevent hackers from dialing it at night? Have you had to unload your ansi.sys driver to prevent malicious text files from remapping your keyboard to make your next keystroke reformat your hard drive? Did you review your autoexec.bat and config.sys files to make sure no malicious entries were inserted to autostart malware?

Not so much these days -- hackers have moved on, and the technology made to prevent older hacks like these is no longer top of mind. Sometimes we defenders have done such a good job that the attackers decided to move on to more fruitful options. Sometimes a particular defensive feature gets removed because the good guys determined it didn't protect that well in the first place or had unexpected weaknesses.

If you, like me, have been in the computer security world long enough, you’ve seen a lot of security tech come and go. It’s almost to the point where you can start to predict what will stick and be improved and what will sooner or later become obsolete. The pace of change in attacks and technology alike means that even so-called cutting-edge defenses, like biometric authentication and advanced firewalls, will eventually fail and go away. Surveying today's defense technologies, here's what I think is destined for the history books.

Doomed security technology No. 1: Biometric authentication

Biometric authentication is a tantalizing cure-all for log-on security. After all, using your face, fingerprint, DNA, or some other biometric marker seems like the perfect log-on credential -- to someone who doesn't specialize in log-on authentication. As far as those experts are concerned, it’s not so much that biometric methods are rarely as accurate as most people think; it's more that, once stolen, your biometric markers can't be changed.

Take your fingerprints. Most people have only 10. Anytime your fingerprints are used as a biometric logon, those fingerprints -- or, more accurately, the digital representations of those fingerprints -- must be stored for future log-on comparison. Unfortunately, log-on credentials are far too often compromised or stolen. If the bad guy steals the digital representation of your fingerprints, how could any system tell the difference between your real fingerprints and their previously accepted digital representations?

In that case, the only solution might be to tell every system in the world that might rely on your fingerprints to not rely on your fingerprints, if that were even possible. The same is true for any other biometric marker. You'll have a hard time repudiating your real DNA, face, retina scan, and so on if a bad player gets their hands on the digital representation of those biometric markers.

That doesn’t even take into account issues around systems that only allow you to logon if you use, say, your fingerprint when you can no longer reliably use your fingerprint. What then?

Biometric markers used in conjunction with a secret only you know (password, PIN, and so on) are one way to defeat hackers that have your biometric logon marker. Of course mental secrets can be captured as well, as happens often with nonbiometric two-factor log-on credentials like smartcards and USB key fobs. In those instances, admins can easily issue you a new physical factor and you can pick a new PIN or password. That isn't the case when one of the factors is your body.

While biometric logons are fast becoming a trendy security feature, there's a reason they aren’t -- and won't ever be -- ubiquitous. Once people realize that biometric logons aren't what they pretend to be, they will lose popularity and either disappear, always require a second form of authentication, or only be used when high-assurance identification is not needed.

Doomed security technology No. 2: SSL

Secure Sockets Layer was invented by long-gone Netscape in 1995. For two decades it served us adequately. But if you haven't heard, it is irrevocably broken and can't be repaired, thanks to the POODLE attack. SSL’s replacement, TLS (Transport Layer Security), is slightly better. Of all the doomed security tech discussed in this article, SSL is the closest to being replaced, as it should no longer be used.

The problem? Hundreds of thousands of websites rely on or allow SSL. If you disable all SSL -- a common default in the latest versions of popular browsers -- all sorts of websites don't work. Or they will work, but only because the browser or application accepts "downleveling" to SSL. If it's not websites and browsers, then it's the millions of old SSH servers out there.

OpenSSH is seemingly constantly being hacked these days. While it’s true that about half of OpenSSH hacks have nothing to do with SSL, SSL vulnerabilities account for the other half. Millions of SSH/OpenSSH sites still use SSL even though they shouldn't.

Worse, terminology among tech pros is contributing to the problem, as nearly everyone in the computer security industry calls TLS digital certificates "SSL certs" though they don't use SSL. It's like calling a copy machine a Xerox when it's not that brand. If we’re going to hasten the world off SSL, we need to start calling TLS certs "TLS certs."

Make a vow today: Don't use SSL ever, and call Web server certs TLS certs. That's what they are or should be. The sooner we get rid of the word "SSL," the sooner it will be relegated to history's dustbin.
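
To put that vow into practice in code, here is a minimal Python sketch (standard library only) of a client that refuses SSL and anything older than TLS 1.2; the host name is just a placeholder for illustration.

import socket
import ssl

# Minimal sketch: a client context that refuses SSLv2/SSLv3 and anything
# older than TLS 1.2. "example.com" is a placeholder host, not a real target.
context = ssl.create_default_context()            # certificate and hostname checks stay on
context.minimum_version = ssl.TLSVersion.TLSv1_2

host = "example.com"
with socket.create_connection((host, 443)) as raw_sock:
    with context.wrap_socket(raw_sock, server_hostname=host) as tls_sock:
        print(tls_sock.version())                 # prints "TLSv1.2" or "TLSv1.3", never SSL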

Doomed security technology No. 3: Public key encryption

This may surprise some people, but most of the public key encryption we use today -- RSA, Diffie-Hellman, and so on -- is predicted to be readable as soon as quantum computing and cryptography are figured out. Many, including this author, have long (and incorrectly) been predicting that usable quantum computing was mere years away. But when researchers finally get it working, most known public encryption ciphers, including the popular ones, will be readily broken. Spy agencies around the world have been saving encrypted secrets for years waiting for the big breakthrough -- or, if you believe some rumors, they have already solved the problem and are reading all our secrets.

Some crypto experts, like Bruce Schneier, have long been dubious about the promise of quantum cryptography. But even the critics can't dismiss the likelihood that, once it's figured out, any secret encrypted by RSA, Diffie-Hellman, and even ECC will be immediately readable.

That's not to say there aren't quantum-resistant cipher algorithms. There are a few, including lattice-based cryptography and Supersingular Isogeny Key Exchange. But if your public cipher isn't one of those, you're out of luck if and when quantum computing becomes widespread.

Doomed security technology No. 4: IPsec

When enabled, IPsec allows all network traffic between two or more points to be cryptographically protected for packet integrity and privacy, aka encrypted. Invented in 1993 and made an open standard in 1995, IPsec is widely supported by hundreds of vendors and used on millions of enterprise computers.

Unlike most of the doomed security defenses discussed in this article, IPsec works and works great. But its problems are two-fold.

First, although widely used and deployed, it has never reached the critical mass necessary to keep it in use for much longer. Plus, IPsec is complex and isn't supported by all vendors. Worse, it can often be defeated by only one device in between the source and destination that does not support it -- such as a gateway or load balancer. At many companies, the number of computers that get IPsec exceptions is greater than the number of computers forced to use it.

IPsec's complexity also creates performance issues. When enabled, it can significantly slow down every connection using it, unless you deploy specialized IPsec-enabled hardware on both sides of the tunnel. Thus, high-volume transaction servers such as databases and most Web servers simply can’t afford to employ it. And those two types of servers are precisely where most important data resides. If you can't protect most data, what good is it?

Plus, despite being a "common" open standard, IPsec implementations don't typically work between vendors, another factor that has slowed down or prevented widespread adoption of IPsec.

But the death knell for IPsec is the ubiquity of HTTPS. When you have HTTPS enabled, you don't need IPsec. It's an either/or decision, and the world has spoken. HTTPS has won. As long as you have a valid TLS digital certificate and a compatible client, it works: no interoperability problems, low complexity. There is some performance impact, but it’s not noticeable to most users. The world is quickly becoming a default world of HTTPS. As that progresses, IPsec dies.
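
As a rough illustration of how low the bar for "HTTPS everywhere" has become, here is a minimal Python sketch of a stock file server wrapped in TLS; the certificate and key paths (server.crt, server.key) are placeholders you would supply yourself.

import http.server
import ssl

# Minimal sketch of an HTTPS-only service: Python's built-in file server wrapped in TLS.
# "server.crt" and "server.key" are placeholder paths to your own certificate and key.
httpd = http.server.HTTPServer(("0.0.0.0", 8443), http.server.SimpleHTTPRequestHandler)
context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
context.load_cert_chain(certfile="server.crt", keyfile="server.key")
httpd.socket = context.wrap_socket(httpd.socket, server_side=True)
print("Serving HTTPS on port 8443 ...")
httpd.serve_forever()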

Doomed security technology No. 5: Firewalls

The ubiquity of HTTPS essentially spells the doom of the traditional firewall. I wrote about this in 2012, creating a mini-firestorm that won me invites to speak at conferences all over the world.

Some people would say I was wrong. Three years later, firewalls are still everywhere. True, but most aren't properly configured, and almost all lack the "least permissive, block-by-default" rules that make a firewall valuable in the first place. Most firewalls I come across have overly permissive rules. I often see "Allow All ANY ANY" rules, which essentially means the firewall is worse than useless. It's doing nothing but slowing down network connections.
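
For readers who want "block by default" made concrete, here is a toy Python sketch of rule evaluation in which anything not explicitly allowed is dropped. The rules and packet fields are hypothetical examples, not a recommended policy.

from dataclasses import dataclass

@dataclass
class Packet:
    protocol: str    # "tcp" or "udp"
    dst_port: int

# Explicit allow list; everything else is dropped. These example rules are illustrative only.
ALLOW_RULES = [
    ("tcp", 443),    # HTTPS
    ("tcp", 80),     # HTTP
    ("udp", 53),     # DNS
]

def decide(packet: Packet) -> str:
    for proto, port in ALLOW_RULES:
        if packet.protocol == proto and packet.dst_port == port:
            return "allow"
    return "drop"    # default deny: no matching rule means the packet is blocked

print(decide(Packet("tcp", 443)))    # allow
print(decide(Packet("tcp", 3389)))   # drop -- not on the allow list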

Any way you define a firewall, it must include some portion that allows only specific, predefined ports in order to be useful. As the world moves to HTTPS-only network connections, all firewalls will eventually have only a few rules -- HTTP/HTTPS and maybe DNS. Other protocols, such as DNS, DHCP, and so on, will likely start using HTTPS-only too. In fact, I can't imagine a future that doesn't end up HTTPS-only. When that happens, what of the firewall?

The main protection firewalls offer is to secure against a remote attack on a vulnerable service. Remotely vulnerable services, usually exploited by one-touch, remotely exploitable buffer overflows, used to be among the most common attacks. Look at the Robert Morris Internet worm, Code Red, Blaster, and SQL Slammer. But when's the last time you heard of a global, fast-acting buffer overflow worm? Probably not since the early 2000s, and none of those were as bad as the worms from the 1980s and 1990s. Essentially, if you don't have an unpatched, vulnerable listening service, then you don't need a traditional firewall -- and right now you don't. Yep, you heard me right. You don't need a firewall.

Firewall vendors often write to tell me that their "advanced" firewall has features beyond the traditional firewall that makes theirs worth buying. Well, I've been waiting for more than two decades for "advanced firewalls" to save the day. It turns out they don't. If they perform "deep packet inspection" or signature scanning, it either slows down network traffic too much, is rife with false positives, or scans for only a small subset of attacks. Most "advanced" firewalls scan for a few dozen to a few hundred attacks. These days, more than 390,000 new malware programs are registered every day, not including all the hacker attacks that are indistinguishable from legitimate activity.

Even when firewalls do a perfect job at preventing what they say they prevent, they don't really work, given that they don't stop the two biggest malicious attacks most organizations face on a daily basis: unpatched software and social engineering.

Put it this way: Every customer and person I know currently running a firewall is as hacked as someone who doesn't. I don't fault firewalls. Perhaps they worked so well back in the day that hackers moved on to other sorts of attacks. For whatever reason, firewalls are nearly useless today and have been trending in that direction for more than a decade.

Doomed security technology No. 6: Antivirus scanners

Depending on whose statistics you believe, malware programs currently number in the tens to hundreds of millions -- an overwhelming fact that has rendered antivirus scanners nearly useless.

Not entirely useless, because they stop 80 to 99.9 percent of attacks against the average user. But the average user is exposed to hundreds of malicious programs every year; even with the best odds, the bad guy wins every once in a while. If you keep your PC free from malware for more than a year, you've done something special.

That isn’t to say we shouldn’t applaud antivirus vendors. They've done a tremendous job against astronomical odds. I can't think of any sector that has had to adjust to the kinds of overwhelming progressive numbers and advances in technology since the late 1980s, when there were only a few dozen viruses to detect.

But what will really kill antivirus scanners isn't this glut of malware. It's whitelisting. Right now the average computer will run any program you install. That's why malware is everywhere. But computer and operating system manufacturers are beginning to reset the "run anything" paradigm for the safety of their customers -- a movement that is antithetical to antivirus programs, which allow everything to run unimpeded except for programs that contain one of the more than 500 million known antivirus signatures. “Run by default, block by exception” is giving way to “block by default, allow by exception.”
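
To show what "block by default, allow by exception" looks like at its simplest, here is a hedged Python sketch that only lets a binary run if its SHA-256 hash is on an approved list. The hash shown is a placeholder; real products such as AppLocker, Gatekeeper, or DeviceGuard work from signatures and managed policies, not a bare script like this.

import hashlib
import sys

# Placeholder allow list; an administrator would maintain real digests here.
APPROVED_SHA256 = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_of(path: str) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def may_run(path: str) -> bool:
    # Block by default: only explicitly approved binaries are allowed to run.
    return sha256_of(path) in APPROVED_SHA256

target = sys.argv[1] if len(sys.argv) > 1 else sys.executable
print(target, "allowed" if may_run(target) else "blocked")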

Of course, computers have long had whitelisting programs, aka application control programs. I reviewed some of the more popular products back in 2009. The problem: Most people don't use whitelisting, even when it’s built in. The biggest roadblock? The fear of what users will do if they can't install everything they want willy-nilly or the big management headache of having to approve every program that can be run on a user’s system.

But malware and hackers are getting more pervasive and worse, and vendors are responding by enabling whitelisting by default. Apple's OS X introduced a near version of default whitelisting three years ago with Gatekeeper. iOS devices have had near-whitelisting for much longer in that they can run only approved applications from the App Store (unless the device is jailbroken). Some malicious programs have slipped by Apple, but the process has been incredibly successful at stopping the huge influx that normally follows popular OSes and programs.

Microsoft has long had a similar mechanism, through Software Restriction Policies and AppLocker, but an even stronger push is coming in Windows 10 with DeviceGuard. Microsoft’s Windows Store also offers the same protections as Apple's App Store. While Microsoft won't be enabling DeviceGuard or Windows Store-only applications by default, the features are there and are easier to use than before.

Once whitelisting becomes the default on most popular operating systems, it's game over for malware and, subsequently, for antivirus scanners. I can't say I'll miss either.

Doomed security technology No. 7: Antispam filters

Spam still makes up more than half of the Internet's email. You might not notice this anymore, thanks to antispam filters, which have reached levels of accuracy that antivirus vendors can only claim to deliver. Yet spammers keep spitting out billions of unwanted messages each day. In the end, only two things will ever stop them: universal, pervasive, high-assurance authentication and more cohesive international laws.

Spammers still exist mainly because we can't easily catch them. But as the Internet matures, pervasive anonymity will be replaced by pervasive high-assurance identities. At that point, when someone sends you a message claiming to have a bag of money to mail you, you will be assured they are who they say they are.

High-assurance identities can only be established when all users are required to adopt two-factor (or higher) authentication to verify their identity, followed by identity-assured computers and networks. Every cog in between the sender and the receiver will have a higher level of reliability. Part of that reliability will be provided by pervasive HTTPS (discussed above), but it will ultimately require additional mechanisms at every stage of authentication to assure that when I say I'm someone, I really am that someone.
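
One of those additional mechanisms is the familiar one-time-password second factor. Here is a minimal Python sketch of TOTP (RFC 6238), the algorithm behind most authenticator apps; the base32 secret below is a placeholder, and a real deployment would provision a unique secret per user.

import base64
import hashlib
import hmac
import struct
import time

def totp(secret_base32: str, interval: int = 30, digits: int = 6) -> str:
    # Time-based one-time password per RFC 6238 (HMAC-SHA1, 30-second steps).
    key = base64.b32decode(secret_base32, casefold=True)
    counter = int(time.time()) // interval
    msg = struct.pack(">Q", counter)
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

print(totp("JBSWY3DPEHPK3PXP"))   # placeholder shared secret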

Today, almost anyone can claim to be anyone else, and there's no universal way to verify that person's claim. This will change. Almost every other critical infrastructure we rely on -- transportation, power, and so on -- requires this assurance. The Internet may be the Wild West right now, but the increasingly essential nature of the Internet as infrastructure virtually ensures that it will eventually move in the direction of identity assurance.

Meanwhile, the international border problem that permeates nearly every online-criminal prosecution is likely to be resolved in the near future. Right now, many major countries do not accept evidence or warrants issued by other countries, which makes arresting spammers (and other malicious actors) nearly impossible. You can collect all the evidence you like, but if the attacker’s home country won't enforce the warrant, your case is toast.

As the Internet matures, however, countries that don't help ferret out the Internet's biggest criminals will be penalized. They may be placed on a blacklist. In fact, some already are. For example, many companies and websites reject all traffic originating from China, whether it's legitimate or not. Once we can identify criminals and their home countries beyond repudiation, as outlined above, those home countries will be forced to respond or suffer penalties.

The heyday of the spammers, when most of their crap reached your inbox, is already over. Pervasive identities and changes in international law will close the coffin lid on spam -- and on the security tech necessary to combat it.

Doomed security technology No. 8: Anti-DoS protections

Thankfully, the same pervasive identity protections mentioned above will be the death knell for denial-of-service (DoS) attacks and the technologies that have arisen to quell them.

These days, anyone can launch free Internet tools to overwhelm websites with billions of packets. Most operating systems have built-in anti-DoS attack protections, and more than a dozen vendors can protect your websites even when being hit by extraordinary amounts of bogus traffic. But the loss of pervasive anonymity will stop all malicious senders of DoS traffic. Once we can identify them, we can arrest them.
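
As a rough sketch of the kind of throttling those protections apply, here is a simple token-bucket rate limiter in Python that cuts off a source once it exceeds its allowance; the rates and the client address are arbitrary examples.

import time
from collections import defaultdict

class TokenBucket:
    # Allow up to `rate` requests per second, with bursts up to `capacity`.
    def __init__(self, rate: float = 5.0, capacity: float = 10.0):
        self.rate = rate
        self.capacity = capacity
        self.tokens = defaultdict(lambda: capacity)
        self.last_seen = defaultdict(time.monotonic)

    def allow(self, client_ip: str) -> bool:
        now = time.monotonic()
        elapsed = now - self.last_seen[client_ip]
        self.last_seen[client_ip] = now
        self.tokens[client_ip] = min(self.capacity,
                                     self.tokens[client_ip] + elapsed * self.rate)
        if self.tokens[client_ip] >= 1.0:
            self.tokens[client_ip] -= 1.0
            return True
        return False    # over the limit: drop or delay this request

limiter = TokenBucket()
for i in range(15):
    print(i, limiter.allow("203.0.113.7"))   # the burst is cut off after about 10 requests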

Think of it this way: Back in the 1920s there were a lot of rich and famous bank robbers. Banks finally beefed up their protection, and cops got better at identifying and arresting them. Robbers still hit banks, but they rarely get rich, and they almost always get caught, especially when they persist in robbing more banks. The same will happen to DoS senders. As soon as we can quickly identify them, the sooner they will disappear as the bothersome elements of society that they are.

Doomed security technology No. 9: Huge event logs

Computer security event monitoring and alerting is difficult. Every computer is easily capable of generating tens of thousands of events on its own each day. Collect them to a centralized logging database and pretty soon you're talking petabytes of needed storage. Today's event log management systems are often lauded for the vast size of their disk storage arrays.

The only problem: This sort of event logging doesn't work. When nearly every collected event packet is worthless and goes unread, and the cumulative effect of all the worthless unread events is a huge storage cost, something has to give. Soon enough admins will require application and operating system vendors to give them more signal and less noise, by passing along useful events without the mundane log clutter. In other words, event log vendors will soon be bragging about how little space they take rather than how much.
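
A hedged sketch of that "more signal, less noise" idea: filter events at the source and forward only those above a severity threshold or carrying a security-relevant ID. The event dictionaries and ID values below are made up for illustration.

SEVERITY_ORDER = {"debug": 0, "info": 1, "warning": 2, "error": 3, "critical": 4}
INTERESTING_EVENT_IDS = {4625, 4720, 1102}   # e.g. failed logon, account created, audit log cleared

def worth_forwarding(event: dict, min_severity: str = "warning") -> bool:
    severe_enough = SEVERITY_ORDER[event["severity"]] >= SEVERITY_ORDER[min_severity]
    flagged_id = event.get("event_id") in INTERESTING_EVENT_IDS
    return severe_enough or flagged_id

events = [
    {"severity": "info", "event_id": 7036, "msg": "service state change"},
    {"severity": "info", "event_id": 4625, "msg": "failed logon"},
    {"severity": "critical", "event_id": 6008, "msg": "unexpected shutdown"},
]
forwarded = [e for e in events if worth_forwarding(e)]
print("forwarded", len(forwarded), "of", len(events), "events")   # forwarded 2 of 3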

Doomed security technology No. 10: Anonymity tools (not to mention anonymity and privacy)

Lastly, any mistaken vestige of anonymity and privacy will be completely wiped away. We already really don't have it. The best book I can recommend on the subject is Bruce Schneier's "Data and Goliath." A quick read will scare you to death if you didn't already realize how little privacy and anonymity you truly have.

Even hackers who think that hiding on Tor and other "darknets" give them some semblance of anonymity must understand how quickly the cops are arresting people doing bad things on those networks. Anonymous kingpin after anonymous kingpin ends up being arrested, identified in court, and serving real jail sentences with real jail numbers attached to their real identity.

The truth is, anonymity tools don't work. Many companies, and certainly law enforcement, already know who you are. The only difference is that, in the future, everyone will know the score and stop pretending they are staying hidden and anonymous online.

I would love for a consumer's bill of rights guaranteeing privacy to be created and passed, but past experience teaches me that too many citizens are more than willing to give up their right to privacy in return for supposed protection. How do I know? Because it's already the standard everywhere but the Internet. You can bet the Internet is next.
