Wednesday, 20 August 2014

Lenovo N20p Chromebook review: An affordable dual-mode device

Lenovo's latest Chromebook functions as both a regular laptop and a stand-supported tablet. But what is it like to use in the real world?

When you think of a Chromebook, you typically think of a keyboard-centric laptop -- but Lenovo's hoping to shake up that mindset with some versatile new devices.

The company has come out with a couple of convertible Chromebooks that can act as both traditional laptops and touchscreen tablets. The first, the Lenovo N20p Chromebook, costs $330 and offers a 300-degree tilting display. The second, the ThinkPad Yoga 11e Chromebook, costs $479 and features a higher-quality screen that bends back a full 360 degrees.

I've been living with the N20p model to start with, and one thing's for sure: It offers a Chrome OS experience like no other.
Body, design and that tilting display

At first glance, Lenovo's N20p Chromebook looks like any run-of-the-mill laptop: The computer has a matte-plastic gray casing with Lenovo's logo and the Google Chrome logo at its top. Open the lid and you're greeted by an 11.6-in. screen and a chiclet-style Chrome OS keyboard.

In that mode, the N20p Chromebook is pleasant enough to use: It's one of the higher-quality devices in its class, with sturdy construction, a commendable keyboard and a smooth-feeling and responsive trackpad. If you press on the center of the lid, you do feel a little give -- almost a slight springiness -- but by and large, the N20p seems well-built and less flimsy than some of the cheaper options in its price range.

The N20p is comfortable to hold on your lap, too: The laptop is 11.6 x 8.3 x 0.7 in. and 2.9 lbs. -- slightly heavier than some of the less sturdy devices of its size but still quite light and easy to carry.

As with other touch-enabled Chromebooks, you have the ability to tap, scroll or zoom the N20p's screen with your fingers, which I find to be a surprisingly useful feature. It's even more interesting, though, when you push the N20p's display back beyond the standard stopping point -- past the flattened-out 180-degree mark and all the way around to its fully tilted stand mode.

In that mode, you actually end up with the keyboard upside-down -- in other words, with keys facing downward -- serving as a base. The keyboard is automatically disabled in that state, so you don't have to worry about accidental key presses. Instead, what you get is a tablet-like experience, complete with a virtual on-screen keyboard that appears when you need it.

Coupled with the N20p's touch input, this setup works incredibly well. It opens up a whole new range of uses for the device while still leaving its traditional operations in place.

I've been using the N20p Chromebook in its laptop mode for work, for instance, then flipping the screen around and shifting into stand mode when I want to do something less input-oriented and more browsing-based -- catching up on articles I've opened throughout the day, scrolling through my social media streams or watching videos with the device resting comfortably on my lap.

It's reached the point where shifting between the system's two modes feels effortless and natural to me, and I've really grown to appreciate having that option. Chrome OS itself isn't entirely optimized for touch, so certain things are still a little awkward -- like trying to tap the small "x" to close a tab with your finger, for example -- but all in all, the touch-centric stand experience is quite pleasant. You just have to think of it as a complement to the traditional laptop environment rather than a replacement for it.

When the N20p is in its stand mode, the user interface does change a bit: All windows appear maximized, while a button shows up in the bottom-right area of the screen that allows you to switch between opened windows using a graphical interface. (Those already familiar with Chromebooks will note that it's the same task-switching command also present on the top row of the regular Chrome OS keyboard.)

The on-screen keyboard works well enough, too, though if you're typing anything more than a few words, you'll almost certainly want to flip the system back around into its laptop mode for easier text input. Given the choice on any device, I think a full-size physical keyboard is always going to be preferable for heavy-duty typing.

Because the screen can be adjusted to any position while the N20p is in its stand mode, you can flip the laptop into a tent-like arrangement if you want -- or even onto its side for a vertically oriented portrait view. I haven't found a need to use either of those orientations, but the possibilities are there if you want them.

As for the display itself, it's the same 1366 x 768 TN panel found in most lower-end Chromebooks these days -- but even within those parameters, it's one of the better screens I've seen. It's glossy, bright and less grainy than the displays on many similarly priced systems. Viewing angles aren't great and it's no match for a higher-quality IPS display, but I've been able to use it for full days without being annoyed or feeling any significant eyestrain.

On the left edge of its frame, Lenovo's N20p Chromebook has a proprietary charging port along with a USB 3.0 port, a dedicated HDMI-out port and a 3.5mm headphone jack. The laptop's right edge, meanwhile, holds a USB 2.0 port and a physical power button -- something slightly different from most Chromebooks, where the power button exists on the keyboard.

The N20p Chromebook has two speakers on either side of its bottom surface. The speakers are pretty decent, with loud, clear and full-sounding audio. They're not the best you'll ever hear, but for this class of device, they're actually quite impressive.

Performance

So far so good, right? Unfortunately, there is one asterisk with Lenovo's N20p Chromebook -- and it's on the subject of performance.

The N20p Chromebook uses one of Intel's new Bay Trail processors -- the Intel Celeron N2830 -- along with 2GB of RAM. In real-world use, it feels like a meaningful step backward from the level of performance I've grown accustomed to seeing with the recent crop of Chrome OS devices, most of which are powered by Intel's speedier Haswell-based chips.

To see the difference between two Chrome OS devices that use Intel processors, I compared the N20p to an Asus Chromebox with a Haswell-based Celeron 2955U processor and 2GB of RAM. The N20p Chromebook was consistently slower at loading pages -- by as much as two to six seconds, depending on the site -- and just seemed significantly less zippy overall.

In fact, even without a side-by-side comparison, the N20p just doesn't feel terribly snappy. I noticed its limitations the most in situations where I had several browser tabs running; there, the device really seemed to struggle and reach levels of sluggishness I haven't experienced on Chrome OS in quite some time.

All things considered, I'd say this: If you're like most people and tend to keep only one or two tabs open at a time, the N20p should be fine for your needs. It's still a noticeable step down from the level of performance you'd get from other similarly priced or even less expensive systems -- which is disappointing, to say the least -- but for basic levels of use, it's acceptable enough and may be a worthwhile tradeoff for all of the device's positives. If you do any resource-intensive multitasking, however, you're going to find yourself frustrated by the relatively low performance ceiling.

Lenovo does offer a model of the N20p Chromebook with a slightly higher-end Bay Trail processor, the Intel Celeron N2930; that model is sold only via Lenovo's website and costs $20 more than the regular base model. While I haven't had an opportunity to test it firsthand, the promise of enhanced performance seems to make the extra $20 a worthwhile investment.

The N20p does do reasonably well in terms of battery life: The laptop is listed for eight hours of use per charge, which is pretty much in line with what I've gotten. As for storage, the device comes with 16GB of onboard space along with the option to expand with your own SD card.
Bottom line

Lenovo's N20p Chromebook offers a compelling experience that goes beyond what the typical Chromebook provides. The tilting display really is a nice touch that expands the device's potential and opens it up to new and interesting types of uses.

The system is held back, however, by lower than average performance -- something we'll probably be seeing more of as Intel's Bay Trail chips make their way into more Chrome OS devices. That's a factor you'll have to closely consider in determining whether the N20p Chromebook is right for you.

The N20p Chromebook is a standout device with lots of attractive qualities. For folks in the power-user camp, it's just a shame it's not available with the more robust internals that other similarly priced products provide.

Best Microsoft MCTS Certification, Microsoft MCITP Training at certkingdom.com


Saturday, 2 August 2014

In search of a social site that doesn't lie

Facebook and OKCupid experiment on users. So what's wrong with that?

A blog post by OKCupid co-founder Christian Rudder described a few of the experiments that the dating website had carried out. In one, OKCupid told people that they would be good matches with certain other people even though the site's algorithms had determined that they would be bad matches. That's right: The company deliberately lied to its users. OKCupid wanted to see if people like each other because they have the capacity to make up their own minds about who they like, or if they like each other because OKCupid tells them they should.

(The controversial post was Rudder's first in several years; he had taken time off to write a book about experimenting on people. Due out next month, the book is called Dataclysm: Who We Are (When We Think No One's Looking).)

The OKCupid post was in part a response to controversy over a recently discovered Facebook experiment, the results of which were published in an academic journal. Facebook wanted to see if people would post more negative posts if their own News Feeds had more negative posts from their friends. In the experiment, Facebook removed some posts by family and friends because they were positive. The experiment involved deliberately making people sadder by censoring friends' more uplifting and positive posts.

Don't like this kind of manipulation? Here's Rudder's response: "Guess what, everybody: if you use the Internet, you're the subject of hundreds of experiments at any given time, on every site. That's how websites work."


What's wrong here

Rudder's "everyone is doing it" rationalization for experimenting on users makes it clear that he doesn't understand the difference between what OKCupid and Facebook are doing, and what other sites that conduct A/B tests of different options are doing.
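
For contrast, a routine A/B test is mechanically simple. This sketch (the experiment name and user IDs are hypothetical, not any site's actual code) shows how a site might deterministically assign users to variants:

```python
import hashlib

def ab_bucket(user_id: str, experiment: str, variants=("A", "B")) -> str:
    """Deterministically assign a user to a test variant by hashing the ID.

    The same user always lands in the same bucket for a given experiment,
    which is what makes an ordinary A/B test repeatable and measurable.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# A hypothetical test of two sign-up button designs:
print(ab_bucket("user42", "signup-button"))
```

A test like this varies presentation and measures clicks; it doesn't lie to users about who is compatible with whom.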

The difference is that OKCupid and Facebook are potentially changing, damaging or affecting the real relationships of real people. They are manipulating the happiness of people on purpose.

These companies might argue that this damage to the mood and relationships of people is small to the point of being inconsequential. But what makes them think it's OK to deliberately do any damage at all?

The other glaring problem with these social science experiments is that the subjects don't know they're participating.

Yes, I'm sure company lawyers can argue in court that the Terms of Service that everyone agreed to (but almost nobody read) gives OKCupid and Facebook the right to do everything they do. And I'm sure the sites believe that they're working so hard and investing so much to provide free services that users owe them big time, and that makes it all OK.

Imagine a splash screen that pops up each month on these sites that says: "Hi. Just wanted to make sure you're aware that we do experiments on people, and we might do experiments on you. We might lie to you, meddle in your relationships and make you feel bad, just to see what you'll do."

No, you can't imagine it. The reason is that the business models of sites like OKCupid and Facebook are based on the assumption of user ignorance.
Why OKCupid and Facebook think it's OK to mess with people's relationships

The OKCupid admission and the revelations about the Facebook research were shocking to the public because we weren't aware of the evolving mindset behind social websites. No doubt the OKCupid people and the Facebook people arrived at their coldly cynical view of users as lab rats via a long, evolutionary slippery slope.

Let's imagine the process with Facebook. Zuckerberg drops out of Harvard, moves to Silicon Valley, gets funded and starts building Facebook into a social network. Zuck and the guys want to make Facebook super appealing, but they notice a disconnect in human reason, a bias that is leading heavy Facebook users to be unhappy.

You see, people want to follow and share and post a lot, and Facebook wants users to be active. But when everybody posts a lot, the incoming streams are overwhelming, and that makes Facebook users unhappy. What to do?

The solution is to use software algorithms to selectively choose which posts to let through and which to hold back. But what criteria do you use?

Facebook's current algorithm, which is no longer called EdgeRank (I guess if you get rid of the name, people won't talk about it), is the product of thousands of social experiments -- testing and tweaking and checking and refining until everyone is happy.

The result of those experiments is that Facebook changes your relationships. For example, let's say you follow 20 friends from high school. You feel confident that by following them -- and by them following you -- that you have a reliable social connection to these people that replaces phone calls, emails and other forms of communication.

Let's say you have a good friend named Brian who doesn't post a lot of personal stuff. And you have another friend, Sophia, who is someone you don't care about but who is very active and posts funny stuff every day. After a period of several months during which you barely interact with Brian but occasionally like and comment on Sophia's posts, Facebook decides to cut Brian's posts out of your News Feed while maintaining the steady stream of Sophia posts. Facebook boldly ends your relationship with Brian, someone you care about. When Brian posts an emotional item about the birth of his child, you don't see it because Facebook has eliminated your connection to Brian.
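
The Brian-and-Sophia dynamic boils down to engagement-weighted filtering. This toy sketch is not Facebook's actual algorithm; the names, scores and threshold are all invented to illustrate the effect:

```python
def filter_feed(posts, engagement, threshold=0.5):
    """Keep only posts from friends whose engagement score clears a cutoff.

    A toy stand-in for feed ranking: posts is a list of (friend, text)
    tuples, and engagement maps each friend to a score in [0, 1].
    """
    return [(friend, text) for friend, text in posts
            if engagement.get(friend, 0.0) >= threshold]

posts = [("Brian", "Our baby was born!"), ("Sophia", "Another funny meme")]
engagement = {"Brian": 0.1, "Sophia": 0.9}  # you rarely interact with Brian

# Brian's big announcement never reaches the feed:
print(filter_feed(posts, engagement))  # [('Sophia', 'Another funny meme')]
```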

And don't get me started on OKCupid's algorithms and how they could affect the outcome of people's lives.

Not only do both companies experiment all the time; their experiments make huge changes to users' relationships.

The real danger with these experiments
You might think that the real problem is that social networks that lie to people, manipulate their relationships and regularly perform experiments on their users are succeeding. For example, when Facebook issued its financial report last month, it said revenue rose 61% to $2.91 billion, up from $1.81 billion in the same quarter a year ago. The company's stock soared after the report came out.

Twitter, which is currently a straightforward, honest, nonmanipulative social network, has apparently seen the error of its ways and is seriously considering the Facebook path to financial success. Twitter CEO Dick Costolo said in an interview this week that he "wouldn't rule out any kind of experiment we might be running there around algorithmically curated experiences or otherwise."

No, the real problem is that OKCupid and Facebook may take action based on the results of their research. In both cases, the companies say they're experimenting in order to improve their service.

In the case of OKCupid, the company found that connecting people who are incompatible ends up working out better than it thought. So based on that result, in the future it may match up more people it has identified as incompatible.

In the case of Facebook, it did find that mood is contagious. So maybe it will "improve" Facebook in the future to build in a bias for positive, happy posts in order to make users happier with Facebook than they are with networks that don't filter based on positivity.

What's the solution?

While Twitter may follow Facebook down the rabbit hole of user manipulation, there is a category of "social network" where what you see is what you get -- namely, messaging apps.

When you send a message via, say, WhatsApp or Snapchat or any of the dozens of new apps that have emerged recently, the other person gets it. WhatsApp and Snapchat don't have algorithms that choose to not deliver most of your messages. They don't try to make you happy or sad or connect you with incompatible people to see what happens. They just deliver your communication.

I suspect that's one of the reasons younger users are increasingly embracing these alternatives to the big social networks. They're straightforward and honest and do what they appear to do, rather than manipulating everything behind the scenes.

Still, I'd love to see at least one major social site embrace honesty and respect for users as a core principle. That would mean no lying to users, no doing experiments on them without their clear knowledge, and delivering by default all of the posts of the people they follow.

In other words, I'd love to see the founders of social sites write blog posts that brag: "We DON'T experiment on human beings."

Wouldn't that be nice?



Sunday, 29 June 2014

Network World's 2014 State of the Network survey

Aligning IT with the business has been a top priority of IT organizations for the past few years, but that is changing, according to the latest State of the Network Survey. IT has apparently made enough headway on the alignment issue that other priorities are coming to the fore. The No. 1 business objective of the 282 IT respondents is decreasing operational costs, while the top technology objective is lowering IT operational costs through server consolidation and overall IT simplification.

When asked about the benefits of SDN, network flexibility is by far the most anticipated benefit, followed by simplified network operation and management. Reducing CAPEX and OPEX are far down on the list, which means IT might have a hard time convincing the CEO and CFO to take the plunge into the world of SDN if there’s no clear financial benefit.

So, where are people deploying SDN? According to our survey, the most popular place for SDN pilot projects is the data center (14%), followed by enterprise/WAN (10%). And a few brave souls (6%) are tackling both. But a full 50% of respondents are still sitting on the sidelines.

The data center is expected to be the biggest beneficiary of SDN technology, according to respondents, followed by enterprise/WAN. Only 10% of respondents plan to take on SDN deployments in both the data center and throughout the enterprise/WAN. And a full 33% of respondents said that SDN is not on their radar at all.

When it comes to thought leadership in the emerging field of SDN, a full 52% of respondents said they weren’t sure, which means there’s plenty of opportunity for an established vendor or an upstart newcomer to grab the attention of enterprise IT buyers. In the meantime, the usual suspects are at the top of the list, with Cisco at 22%, Juniper at 12%, HP at 11% and Nicira/VMware with a combined 14%.

When it comes to security-related challenges, IT execs are clearly facing a number of new problems, with advanced persistent threats high on the list, followed by mobile/BYOD and cloud security. But surprisingly, the No. 1 challenge was end users. Respondents said getting awareness and cooperation from end users was their biggest headache.

Productivity-related challenges fell into very traditional categories, with money being far and away the top impediment to increased IT productivity, according to respondents. Traditional concerns like security, privacy and finding the right talent were also near the top of the list. At the bottom of the list are two seemingly hot technologies -- video and social media. But it seems that enterprise IT has bigger fish to fry.

Protecting the network/data center against data breaches and data leaks is Job One, according to respondents. Traditional IT metrics like uptime and optimizing end-to-end performance were high on the list. Interestingly, respondents put cloud-related projects lower down on their priority lists.


Bad news for Satya Nadella: Nearly half of respondents say a migration to Windows 8 isn’t even on their radar. Only 7% of enterprise IT respondents have migrated to Microsoft’s latest OS, while only 10% are in the pilot stage.

Cloud services are certainly gaining in popularity, but among our respondents, enthusiasm for Infrastructure-as-a-Service is pretty tepid. Only 15% of respondents are using IaaS, with another 7% piloting and 10% researching. However, 45% of respondents don’t have IaaS on their radar.

IT execs in our survey are making good progress when it comes to implementing a BYOD policy. Already, 18% have rolled out a BYOD policy, with another 18% in the pilot stage. Only 30% of respondents are ignoring the need for a formal BYOD policy.

Our respondents were gung-ho when it comes to server consolidation: a full 44% have already implemented this cost-saving measure, while 9% were in the pilot stage, 14% were researching and another 13% had server consolidation on their radar.


The move toward flattening the data center – moving from a traditional three-tier, spanning-tree architecture to something more streamlined and efficient – appears to be going strong. Eighteen percent of respondents have already achieved some level of data center network flattening, while 17% are in the research phase and 9% are actively piloting.


WAN optimization is a proven money saver for enterprise IT. And adoption of this technology appears to be on the rise, with 16% of respondents having achieved some level of WAN optimization, another 18% in the pilot phase and 17% researching the technology.





Monday, 16 June 2014

Three best practices for reducing the risk of SQL injection attacks

This column is available in a weekly newsletter called IT Best Practices.

SQL injection attacks have been around for more than 10 years. Database security experts know they are a serious problem. Now a recently unsealed Second Superseding Indictment against a notorious group of Russian and Ukrainian hackers shows just how damaging this type of attack can be.

The indictment provides a long list of companies that have suffered costly data breaches where the root cause has proven to be a SQL injection. According to the indictment:

Beginning on or around Dec. 26, 2007, Heartland Payment Systems was the victim of a SQL injection attack that resulted in malware being placed on its payment processing system, the theft of more than 130 million card numbers and losses of approximately $200 million.
In or about early November 2007, a related company of Hannaford Brothers Co. was the victim of a SQL injection attack that resulted in the later placement of malware on Hannaford's network and the theft of approximately 4.2 million card numbers.
Between January 2011 and March 2012, Global Payment Systems was the victim of SQL injection attacks that resulted in malware being placed on its payment processing system, the theft of more than 950,000 card numbers, and losses of approximately $92.7 million.
In or around May 2007, NASDAQ was the victim of a SQL injection attack that resulted in the placement of malware on its network and the theft of login credentials.

I think you are beginning to see the pattern here. Other companies cited in this indictment as victims of attacks include 7-Eleven, JC Penney, Carrefour S.A., Wet Seal, Commidea, Dexia Bank Belgium, JetBlue Airways, Dow Jones, Euronet, Visa Jordan Card Services, Diners Club International, Ingenicard US and an unnamed bank.

The indictment goes on to say that “conservatively, the defendants and their co-conspirators unlawfully acquired over 160 million card numbers through their hacking activities. As a result of this conduct, financial institutions, credit card companies, and consumers suffered hundreds of millions in losses, including losses in excess of $300 million by just three of the corporate victims, and immeasurable losses to the identity theft victims due to the costs associated with stolen identities and fraudulent charges.”

These particular breaches occurred in 2007. Think how many additional breaches, large and small, have occurred since then.

I think it will come to light that the recent Target, Neiman Marcus and Michaels breaches also might stem from SQL injection attacks of some sort. Though it hasn’t been made public, security experts are already saying that the Target breach used SQL injection to install malware on the point-of-sale systems where the attackers were then able to collect the card numbers out of memory. Many people don’t realize that SQL can be bidirectional. It can be used to drain the database but it also can be used to modify and upload to a database. An attacker can use SQL injection to upload the malware into the database system and then have that system send out the malware to all the POS endpoints.

Structured Query Language is flawed because of the way it was architected. It can be fooled into trying to interpret data as an instruction. On the other hand, there’s a lot of capability in SQL that makes it attractive to developers, especially for web applications.
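
That flaw, data crossing over into instruction, is easy to demonstrate. Here is a minimal sketch using Python's built-in sqlite3 module; the users table and its contents are made up for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, card TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', '4111-1111-1111-1111')")

user_input = "nobody' OR '1'='1"  # attacker-controlled string

# Vulnerable: the input is spliced into the SQL text, so the quoted
# payload is interpreted as part of the instruction.
vulnerable = conn.execute(
    f"SELECT card FROM users WHERE name = '{user_input}'").fetchall()
print(vulnerable)  # leaks every card number in the table

# Safe: a parameterized query keeps the input as pure data.
safe = conn.execute(
    "SELECT card FROM users WHERE name = ?", (user_input,)).fetchall()
print(safe)  # [] -- no user is literally named "nobody' OR '1'='1"
```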

Since the consequences of SQL injection attacks can be so damaging, I asked Michael Sabo of DB Networks about best practices that companies can follow in order to reduce their risk of this threat. Sabo says there’s no silver bullet, but he does have some advice.

“Often you will hear, ‘if you just do this, or just do that, the problem will go away’,” says Sabo. “But it’s not that simple. Any individual countermeasure can go a long way but it is not going to close the threat. It doesn’t work that way.”

He says that one popular countermeasure that is promoted by the Open Web Application Security Project (OWASP) is to write perfect code. “Even if I write perfect application code, I can still be vulnerable because the vulnerabilities come in through third-party software that I had nothing to do with,” says Sabo. “Look at Ruby on Rails. Who knew that the underlying framework was vulnerable? It affected 250,000 websites with a SQL injection vulnerability because those developers built their websites on top of the vulnerable framework.”

Sabo says there are instances in which they have found vulnerabilities in the relational database management system itself. "Oracle has had SQL injection vulnerabilities in the RDBMS itself, so regardless of how good I write my application code, I can still be vulnerable," he says.

Short of having perfect code, there are three critical things companies can do to reduce the risk of experiencing a SQL injection attack.

The first is to conduct an inventory of what you have as far as databases go, and understand their connections to applications. “Many companies are completely unaware of some of the databases in their environment,” says Sabo. “And even if they know about all their databases, often what happens is the database is being exposed on network segments that it’s not supposed to be exposed on. This is not a database problem per se, but a networking problem.”

For example, Sabo says a company might bring up a database in a test environment and then forget to close it down at the end of testing. Often that database might have default passwords, and sometimes it has real data. Developers do this sort of thing because they want to stress test the application and they use real rather than fake data because they think no one will ever see it.

Then there is the mapping issue. What applications are mapped to the database, and are they the correct ones? “Maybe for a test, a production database was connected up to a test database for a short while and then the connection was left by accident. Or a production database is mapped to an application that was retired, or that no one knows about. These things happen,” says Sabo. “So our first best practice is to provide visibility and an inventory into what databases you have and what they are mapped to.”

The next step is to continuously monitor what is going on between your application and the database. This is actually a recommendation from NIST. You will want to know if there is any rogue traffic going on there. This is where you look for SQL injections because you see the real SQL going across. There are tools that continuously monitor this traffic and detect if there is an unauthorized attempt at modifying data or getting data out.
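
As a rough illustration of what such monitoring looks for: the signatures below are simplified examples, and real products model normal query behavior rather than just matching patterns, but the idea of watching the SQL that flows between application and database is the same.

```python
import re

# Crude signatures of statements that shouldn't appear in normal
# application traffic to the database.
SUSPICIOUS = [
    re.compile(r"\bUNION\b.+\bSELECT\b", re.I),          # UNION-based extraction
    re.compile(r"('|\")\s*OR\s+('|\")?1('|\")?\s*=\s*('|\")?1", re.I),  # tautology
    re.compile(r";\s*DROP\b", re.I),                     # stacked destructive query
]

def is_suspicious(sql: str) -> bool:
    """Flag a SQL statement that matches a known injection pattern."""
    return any(p.search(sql) for p in SUSPICIOUS)

print(is_suspicious("SELECT card FROM users WHERE name = '' OR '1'='1'"))  # True
print(is_suspicious("SELECT card FROM users WHERE name = ?"))              # False
```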

And finally, the last best practice is to protect the database network with data loss prevention tools. “If you start to see credit card information coming out over the network and you know it shouldn’t be coming out that way, you know there is a problem,” says Sabo.
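
At its simplest, a check of the kind Sabo describes might scan outbound payloads for digit runs that validate as card numbers. This is a toy sketch; real DLP products are far more sophisticated:

```python
import re

def luhn_ok(number: str) -> bool:
    """Standard Luhn checksum used to validate card numbers."""
    digits = [int(d) for d in number][::-1]
    total = sum(digits[::2]) + sum(sum(divmod(2 * d, 10)) for d in digits[1::2])
    return total % 10 == 0

def find_card_numbers(payload: str):
    """Flag 13-16 digit runs that pass the Luhn check -- a toy version
    of the pattern matching a DLP tool might do on outbound traffic."""
    candidates = re.findall(r"\b\d{13,16}\b", payload)
    return [c for c in candidates if luhn_ok(c)]

# 4111111111111111 is a well-known Luhn-valid test number.
print(find_card_numbers("order=123&card=4111111111111111"))  # ['4111111111111111']
```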

If your organization has some serious data to protect, and you know how common SQL injection attacks are, then it may benefit you to put these recommendations into practice.




Friday, 30 May 2014

The Top 11 Cities for Technology Careers

Is your particular skill set saturated in your current location? Are you thinking about taking your tech talents to a new locale, possibly one with more IT career opportunities? If so, here are some cities you may want to consider before choosing a new home.

The Cities that Offer IT Professionals the Most Bang for the Buck
What makes a city one of the best for technology careers? A few key factors quickly bubble to the top: how much does the average IT worker make in that city, how many opportunities are there, and what is the cost of living? At first glance, that great tech job out in LA or NYC might pay more than your current position, but after you factor in the cost of living, it quickly becomes obvious that all things are not equal.

So come along for this country-wide tour of the U.S. as CIO.com counts down what, according to Dice data, are the best U.S. cities for IT jobs.

How We Got Here
Who doesn't want more bang for their buck? To some people, where they live isn't a major factor in their job hunt, and if there are more opportunities somewhere else, why not head there? For those individuals, CIO.com worked with Dice, a major career site, to gather data on the best U.S. cities for technology jobs and average IT salaries. We then cross-referenced that data with cost-of-living index data provided by Dean Frutinger, Project Manager at the Center for Regional Economic Competitiveness. By weighing these three factors together, IT professionals can more accurately assess whether it's time for a change of location.
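
The weighting can be sketched with simple arithmetic. The salaries and index values below are invented for illustration, not the actual Dice or CREC figures; an index of 100 represents the U.S. average cost of living.

```python
def col_adjusted_salary(salary: float, col_index: float) -> float:
    """Normalize a salary by a cost-of-living index (100 = U.S. average)."""
    return salary * 100 / col_index

# Hypothetical offers: (nominal salary, cost-of-living index).
offers = {
    "Austin":        (95_000,  95.0),
    "San Francisco": (120_000, 164.0),
}
for city, (salary, index) in offers.items():
    print(f"{city}: ${col_adjusted_salary(salary, index):,.0f} adjusted")
```

Under these made-up numbers, Austin's adjusted figure comes out ahead despite the lower nominal salary -- that is the "bang for the buck" idea.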

1. Austin, Texas
The capital of Texas, Austin, comes in at number one on our list. While it may not have the gravitas of NYC or Silicon Valley, there is a lot to be said for this growing technology hub. This bike-friendly metropolis sports over eighty miles of bike paths, and its rapidly expanding market offers great opportunities with a low cost of living. Austin regularly shows up on lists of the best cities for tech startups.

Professional Sports Teams: The Austin area isn't known for its sports cachet, but there are NBA, NHL and MLB farm teams here, along with several universities.

2. Houston, Texas
The Houston area continues to grow. "Oil and Gas, of course, ranked the highest in terms of needing highly skilled and educated employees, but let's not lose sight of medical, technology, engineering, retail, education and financial fields -- all large sectors in West Houston. There is a scramble for highly educated and skilled employees," says Jeannie Bollinger, President and CEO of the Houston West Chamber of Commerce. According to Dice numbers, Texas ranked as the fourth fastest-growing technology employment state in the U.S., which explains how three of the Lone Star State's cities made it onto this list.

Professional Sports Teams: The Texans, the Astros, the Rockets, the Dynamo, and the Dash.

3. Atlanta, Georgia
A low cost of living and high average IT salaries are making life in 'the ATL' sweet. 'The Big Peach' sports good universities, a major airport hub and a thriving music scene.

According to reports from Dice, mobile is driving the job growth in this region. The Atlanta metro area recently saw a 66 percent increase in the number of online job postings. This was across all industries, but tech accounts for the lion's share of the jobs created.

Professional Sports Teams: The Falcons, the Braves, the Hawks and the Silverbacks (NASL).

4. Phoenix, Arizona
If the dry desert weather appeals to you, then our next city could be your next home. Phoenix, also known as 'The Valley of the Sun,' is a little gem in the Southwest that has managed to struggle back from a crushing foreclosure crisis. Recently ranked as the fourth fastest-growing city for technology job postings and fourth in the country for tech salary growth, Phoenix has seen average tech salaries increase by an impressive 12 percent year over year.

Professional Sports Teams: The Cardinals, the Diamondbacks, the Rattlers, the Coyotes, the Mercury (WNBA) and the Suns.

5. Denver, Colorado
Are you a fan of winter? Then Denver, in the snowy mountains of Colorado, should definitely be in consideration. Colorado has several tech biggies in its backyard, including IBM, Oracle, Lockheed Martin and Avaya, as well as a thriving tech startup scene.

More than 53 percent of Downtown Denver's population holds a bachelor's degree, making it one of the smartest cities in America. It ranks as one of the top relocation destinations among highly-skilled workers between 25 and 44. The relatively low cost of living compared to Silicon Valley or NYC combined with the slower pace of living attracts many tech workers fleeing the coasts.

Professional Sports Teams: The Broncos, the Rockies, the Nuggets, the...

6. Dallas, Texas
The Dallas-Fort Worth area is a sprawling metropolis situated in the plains of Texas. It ranked number three for total number of high-tech jobs last year. The region employs over 136,000 tech workers and has seen 6.6 percent annual growth, making Dallas an emerging market and one to watch. Everything is bigger in Texas, and the area is known for its big steaks, BBQ, Mexican and Tex-Mex foods. Combine all of this with a great set of sports teams, a high standard of living and a low cost of living, and it becomes easy to see why your next job may be deep in the heart of Texas.

Professional Sports Teams: The Cowboys, the Rangers, the Mavericks, the Stars and FC Dallas.

7. Charlotte, North Carolina
Another state managing to grab multiple positions on our list is North Carolina. Charlotte itself comes in at number four on the list of cities offering the most bang for the buck. Beautiful weather, low cost of living, proximity to the tri-state area and a growing market all come together to make Charlotte a consideration for those looking to leave the fevered pace of areas such as NYC, New Jersey and Baltimore behind.

Professional Sports Teams: The Panthers, NASCAR and the Bobcats.

8. Raleigh, North Carolina
Recently Forbes named Raleigh number two on its fastest growing cities list. Cisco, IBM, GlaxoSmithKline and many others have set up shop here to take advantage of the economic benefits as well as the talent pool.

Close proximity to several universities makes this one of the nation's more highly educated towns. In January it was declared that "The (research) triangle has the nation's most 'educated' center cities." In fact, Raleigh regularly rates as one of the best places to live in the U.S. Great weather, reasonable housing prices and lots of jobs make Raleigh an attractive place to call home.

9. Chicago, Illinois
Sweet Home Chicago. Whether you're renting a building for your new startup or looking for your new home, prices here are rock-bottom when compared to Silicon Valley, San Francisco, New York City or LA.

The city was designed so that most places are within walking distance of several parks. Not many people think of beaches when they think of Chicago, but the city is actually home to 29 miles of beaches along Lake Michigan. Combine all this with a sports team in virtually every professional sport, and suddenly Chicago as a destination for tech pros starts to make a lot of sense.

Professional Sports Teams: The White Sox, the Cubs, the Blackhawks, the Bulls, the Fire, Da Bears, the Wolves and the...

10. Portland, Oregon
Rounding out our list is Portland, Oregon, or 'The City of Roses' as it's known thanks to its perfect rose-growing climate. Regularly noted as a bike-friendly and green city, Portland has much to offer those who love the outdoors. Other reasons to consider calling Portland home include a robust music scene and a strong job market. "In my opinion, if you are a software engineer graduating from college right now, there is no better city you can move to than Portland," says Sam Blackman, CEO and co-founder of Elemental, a Portland tech startup.

Professional Sports Teams: The Trail Blazers, the Thorns and the Timbers.


Sunday, 25 May 2014

How far are you willing to go to spy on your employees' smartphones?

mSpy monitoring service/app tracks lots of data, but is it too snoopy?

The scoop: mSpy mobile phone monitoring service/app, starting at $40 per month (as tested, features would cost $70 per month)

What is it? The ultimate eavesdropping solution for people who want to see what their employees, kids or spouse are doing on their Android (or jailbroken iPhone) smartphone. The service can track what phone numbers are being called; the recipients and contents of text messages; what photos, videos and audio recordings they’re taking; what websites they’re visiting; and what emails they’re sending. You can also block the smartphone from visiting specific websites, block specific applications, and monitor other apps (Skype, WhatsApp, Facebook and Viber).

Why it’s cool: The vast number of things that the app/service can monitor is quite impressive, if not totally complete (for example, you can’t see incoming MMS messages, so the off-color photo your daughter receives from her boyfriend won’t be detected). Features that the service offers — including device wipe, app/site blocking and incoming phone call blocking — are usually only seen in enterprise-level mobile device management (MDM) products/services. Seeing a service like this target consumers and (more likely) small-to-midsize businesses is an interesting trend.

Here’s a video that mSpy produced touting its service:

Some caveats: We had difficulty hearing our recorded phone calls (all we got was static rather than a recording); the location tracker seemed to utilize the cell phone towers for location, not the device’s GPS function (it took some time for the system to discover where the phone was located). The folks at mSpy said the likely culprit was an older version of the software on our test Android phone - but instead of an over-the-Internet firmware/app update, they said they’d have to update the phone in person (a paying user would likely have to physically update the app on the phone as well).

The bigger issue for users is whether you want or need this amount of monitoring of your mobile devices. This is major spying/monitoring territory that you’re entering here - being able to see exactly what the smartphone user is doing with their phone. Whether it’s your employee, your child or your spouse/partner, the issue of trust comes up with software like this. Even though mSpy says on its site that “My Spy (mSpy) is designed for monitoring your employees or underage children on a smartphone or mobile device that you own or have proper consent to monitor,” and “You are required to notify users of the device that they are being monitored,” there’s a big chance that the user will forget about this at some point, and the boss/parent/spouse/partner will end up seeing something they might not want to see. It’s a level of privacy invasion that I’m not comfortable inflicting on my wife and kids (maybe I’ll feel differently when my kids get older), and I’d have doubts about having IT staff do this with employees. If you have any doubts about what the app/service can do or is aimed at, type mSpy into YouTube’s search box and note that the second video is called “How I caught my boyfriend cheating using mSpy”.

The second issue is the cost. At $40 per month (the starting level, the features we tested would cost $70 per month), this service is cost prohibitive for a large majority of consumers, as it approaches (or even exceeds) the cost of a monthly phone service plan. However, mSpy does offer a 10-day refund policy, so maybe you can use the service for nine days to see what your spouse/child/employee is doing, and then cancel the service.


Saturday, 10 May 2014

Microsoft blurs Windows XP's end-of-support line

Microsoft has been urging users to stop using Windows XP for a very long time.

Microsoft's decision to erase its support line in the sand has sown uncertainty and will likely encourage bad behavior by some customers, analysts said.

"If next month someone finds another zero-day like this one, Microsoft might just shift the line once more," said John Pescatore, director of emerging security trends at the SANS Institute, a security training company.

"In a way, this encourages bad behavior. There's a risk that people will look at it that way," said Michael Silver, an analyst with Gartner, referring to those who will now question Microsoft's determination to end XP support, and thus slow or even suspend their migration to newer editions of Windows.

The analysts were discussing Microsoft's move on May 1 to issue fixes for a serious vulnerability in Internet Explorer (IE) that had been disclosed the week before, and used by cyber criminals for an unknown span of time before that to take control of Windows PCs. Patching the bug was not unusual; what was out of the ordinary was Microsoft's decision to push the patch to Windows XP machines.

Microsoft had set the end of support for Windows XP as April 8, a date it had broadcast for years. When Microsoft software reaches its support expiration date, the company's stated policy is to stop issuing public patches.

Just days after the deadline, Microsoft essentially said, "Never mind," and patched the IE vulnerability on Windows XP. What had been certain -- the support line in the sand -- became irresolute.

Microsoft stood by the decision, saying it had bowed to what it called "overblown" media coverage and explaining that it did so only because XP had only recently been retired.

"I don't think the coverage was overblown," said Pescatore.

Wes Miller, an analyst with Directions on Microsoft, agreed. "It was an extremely bad vulnerability," he pointed out.

Even so, the analysts were surprised at the release of a fix for XP, not only because of the line Microsoft had so firmly drawn, but because of the ramifications of erasing that line.

The precedent was what worried the experts. "Absolutely, the precedent matters to Microsoft," said Miller. "It's not a question of if, but when, this issue will come up again. Until key organizations are off XP, every major vulnerability becomes a significant opportunity for exploitation."

Some customers still running Windows XP may view Microsoft's patching decision as a pass to keep running the 13-year-old operating system, which, as Microsoft has repeatedly hammered home, lacks many of the advanced security and anti-exploit features and technologies found in newer editions, including Windows 7 and Windows 8.1.

Even further in the future, customers running Windows 7 may recall this XP patch and conclude that Microsoft is not serious about retiring that OS when its January 2020 support deadline nears.

"There is now a difference between what Microsoft thinks they mean and what [customers] think they mean," said Miller. "Everyone is playing chicken. Which means [years from now] people may say, 'I can keep running Windows 7.'"

Microsoft was in a "lose-lose" situation with XP, according to Silver, because of the operating system's large user base. At the end of April, XP powered about 26% of the world's personal computers, analytics company Net Applications revealed last week.

Although Microsoft didn't acknowledge XP's stubborn resistance to retirement, or the huge number of PCs that still run the OS, the decision was clearly based on the operating system's continued prominence. Which makes one wonder, analysts said, what Microsoft may do in the weeks and months to come.

"Maybe Microsoft thought hard about this one. But if the same thing happened in a year, you wouldn't see it. So that [patch last week] may have been the real line," contended Silver.

"Six months from now, an XP vulnerability may get the same [media] coverage," said Pescatore. "But then Microsoft has a much stronger story. They might say, 'XP's share has dropped in half since April, so we're sticking to the plan.'"

