Thursday, 29 August 2013

Juniper kills MobileNext mobile packet product line

Juniper MobileNext was a high-profile competitor to Cisco's Starent-derived gateway, designed to enable uninterrupted delivery of high-definition voice and video over 2G/3G and LTE mobile networks

Juniper has killed a high-profile product for the core of mobile operator networks after combining business units to focus on potential growth opportunities.

Juniper has killed, or what it calls end-of-lifed (EOL), its MobileNext mobile packet core product line: software introduced in 2009 as part of “Project Falcon” for its MX edge routers, designed to enable uninterrupted delivery of high-definition voice and video to users over 2G/3G and LTE mobile networks. MobileNext was launched at Mobile World Congress in early 2011 to allow Juniper’s MX 3D to function as a broadband gateway, as an authentication and management control plane for 2G/3G and LTE mobile packet cores, and as a policy manager for subscriber management systems.

MobileNext was intended to compete with Cisco’s ASR 5000 LTE gateway, obtained from its acquisition of Starent. But the product was struggling to gain traction in the market and was one of a handful of new Juniper products straining company financials as they went through lengthy evaluation cycles with potential customers.

Juniper is killing the entire MobileNext offering, which consists of three products: the Mobile Broadband Gateway; the Mobile Control Gateway; and the Mobile Policy Manager. The company claims, however, that its mobility strategy for the operator core remains intact.

“We have made the decision to end-of-life the MobileNext solution,” a Juniper spokesperson says. “However, our strategy remains unchanged: to virtualize mobile networks and deliver innovation through our existing portfolio of backhaul, security, routing and edge services with products such as the MX Series 3D Universal Edge Routers, SRX Series Services Gateways and JunosV App Engine software virtualization platform. We will continue to work with our partners to deliver best-in-class solutions that help customers improve network economics and accelerate delivery of new mobile services.”

Juniper will now address mobile packet core requirements through software-defined network (SDN) and network functions virtualization (NFV) capabilities, according to an internal memo authored by Daniel Hua, senior vice president of Juniper’s Routing Business Unit, and obtained by Network World.

“Despite our decision to EOL MobileNext we remain committed to executing on all existing commitments to our customers and to the mobility space longer term. We believe we can meet the needs of our customers by providing the underlying virtualized mobile infrastructure (routing, switching, SDN and NFV) to enable customers to make this transition as well as offer specific virtualized network functions.”

Indeed, Juniper earlier this year announced a virtualized, SDN version of the Mobile Control Gateway based on the JunosV App Engine, which is shipping now on the MX router.

MobileNext’s demise comes as Juniper merges its Edge Services Business Unit into its Routing Business Unit. Hua explains the rationale for this in his memo:

“The compelling reason driving this organization alignment is to increase synergy and focus under the umbrella of a single routing business unit. We believe this step will ensure close alignment of our embedded and virtual services with our market-leading MX and PTX platforms. Many of the network edge services were originally developed as extensions of the Junos OS within RBU. We are realigning these services back to its original function allowing us to strengthen and further innovate in the areas of our Access, Edge, and Core offerings through tighter integration of network services.”

Sources say Juniper is also scaling down development of its Junos Content video and media delivery product line, formerly known as Media Flow and obtained from the $100 million acquisition of Ankeena Networks in 2010. Junos Content is designed to optimize mobile and fixed networks for efficient video and media delivery to smartphones and other mobile devices.


Best Microsoft MCTS Certification,
Microsoft MCITP Training at certkingdom.com

Wednesday, 21 August 2013

Identifying performance bottlenecks on a .NET Windows app using Windows Debugging Tools and ANTS Profiler. Part I: NHibernate byte[] types

No comments · Posted by LizetP in .NET, CLR, Performance bottlenecks, Windows Debugging Tools

This is a curious case that led me to discover and use a very valuable tool, ANTS Profiler, and to read a few good blogs about .NET debugging and CLR internals. Read on and bookmark with me.

Near Christmas we received a complaint that one of our Windows applications was performing too slowly after a few hours of use. Performance Monitor counters indicated the problem lay in high CPU peaks sustained for long periods of time.

.NET memory counters looked fine: no increase in allocated bytes or overall memory consumption, no high IO reads, no heavy network usage… apparently the application was just doing its work, but for a long time, and longer each time…

The first thing that came to mind was an infinite loop. The curious part of this case, however, was that the CPU peaks grew longer the longer the end user worked in the application, and only became noticeable after a couple of hours: not quite the definition of an infinite loop.

Had we had a faster CPU, the performance degradation would have taken more hours to become noticeable. This was something to be thankful for: a slow CPU meant less time to reproduce the problem. It was also one of those typical production-only problems :-p

Long sustained CPU peaks: how we dug down to the cause.

First we grabbed the free debugging tools (insert the obvious reasons here: budget, management approval, etc.): CLRProfiler, WinDbg, SOS and ADPlus.

Two great blog posts about how to start with these tools can be found here (Speaking Of Which) and here (Maoni’s blog).

MSDN Magazine also has two good articles (Bugslayer column and this CLR Inside Out column) on the subject of windows debugging tools and how to use them in VS 2005.

Back to our own experience on the matter: CLRProfiler hung the machine beyond response, and although it could sketch the object graph in memory, it was hard to correlate the times of the high CPU peaks with the information it produced.

This was not a problem with the tool itself. The hanging was due to weak hardware and our over-consuming application, and the inability to pinpoint the main cause of CPU usage was because CLRProfiler is only meant to identify and isolate problems related to garbage collection, excessively long-lived objects or huge collections.

At first we thought the high CPU could be related to garbage collection of long-lived objects; see this post on Tess’s blog, If broken it is, fix it you should.

We collected memory dumps with ADPlus during the high CPU peaks as per this lab blog post and analyzed the memory dumps using WinDBG.

In the end we decided to have more control over when the dumps were taken and to attach WinDbg directly to the process. I should also mention that ADPlus ended up generating dumps with errors when the system was really stressed.

Instructions to take a dump via WinDbg:
1. Run the application.
2. Open WinDbg. Click File > Attach to Process, select the process and click OK.
3. WinDbg attaches to the process and waits at the command line. Type ‘g’ and hit Enter; ‘g’ lets the application run.
4. Whenever you want to take a dump, hit Ctrl+Break in WinDbg, then type: .dump /ma C:\Dump1.dmp
5. Type ‘g’ and hit Enter for the process to resume.

WinDbg can give valuable information about the CLR stack at the time the dump was collected (the !clrstack command), the objects queued for finalization (!finalizequeue) and how many objects marked for finalization belong to Gen 0, Gen 1 and Gen 2.
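Putting the dump-taking steps and the inspection commands together, a session in the WinDbg command window looks roughly like this. This is only a sketch; the .loadby line assumes a .NET 2.0-era process, where the SOS extension ships alongside mscorwks:

```
0:000> g
   ...Ctrl+Break during a CPU peak...
0:000> .dump /ma C:\Dump1.dmp
0:000> .loadby sos mscorwks
0:000> !clrstack
0:000> !finalizequeue
0:000> g
```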

Seeing your managed stack at a single point in time, or having exact information about memory allocation, does not tell you the amount or percentage of CPU time each method takes, though.

We tried taking dumps with WinDbg at the beginning of a CPU peak, in the middle and at the end, but the results only offered a hint: too many collections were allocated and survived to Gen 2. Some of these collections were byte arrays. It wasn’t apparent from analyzing the three managed stacks (from the three memory dumps) which method was consuming the most time.

So far we had lots of collections surviving to Generation 2, some of them byte arrays. Garbage collection counters, however, were within normal ranges.

If the application was just “busy” doing its stuff, where was the time spent? Data binding? Event brokerage? Database access latency and query performance had already been ruled out with SQL Profiler, by the way.

The main cause of the sustained CPU peaks was discovered using ANTS Profiler; memory leaks and long GC cycles had been ruled out with the free tools mentioned above.

ANTS Profiler lets you select .NET performance counters and attaches itself to the application being profiled. You cannot set breakpoints, as far as I know, but you can go back in the profiler results and drag the cursor over a region to get a full call-stack walk. It goes beyond that, indicating the percentage of CPU time each method takes and the percentage its children take.

Finally! A tool that correlates performance counters with the call stack for you and indicates the percentage of CPU time per method. This is information you cannot gather by taking memory or call-stack snapshots; unfortunately, the free tools were only useful for ruling out memory leaks and GC-related problems in this particular case, though they did narrow down the places to look.

As you can see from the ANTS Profiler screenshot, the application was indeed doing stuff: in this case comparing collections of bytes, byte by byte… Ouch!

We were able to identify the collection-comparison problem: byte[] arrays were being compared when the NHibernate session was flushed and persisted, even when they hadn’t changed. We correlated this with a fixed NHibernate bug:
http://jira.nhibernate.org/browse/NH-1246
and changed our mapping attributes to indicate there was no need to update the BinaryBlob fields. Our application either inserts the binary data or deletes it.

Note: you should be logged into http://jira.nhibernate.org/ before navigating to this bug report, registration is free.

Our NHibernate version and mapping strategy contained the buggy bits…
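For anyone hitting the same issue before upgrading, the workaround can also be expressed directly in the mapping. The fragment below is a minimal sketch, not our actual mapping: the class and property names are hypothetical, and it assumes an hbm.xml-style mapping where marking the blob property with update="false" tells NHibernate never to include that column in UPDATE statements, so flushing the session no longer triggers the byte-by-byte comparison:

```xml
<class name="Document" table="Documents">
  <id name="Id" column="Id">
    <generator class="native" />
  </id>
  <!-- update="false": the column is written on insert but excluded
       from updates, avoiding the byte[] dirty check on session flush -->
  <property name="Content" column="Content"
            type="BinaryBlob" update="false" />
</class>
```

The same intent can be declared via NHibernate.Mapping.Attributes if you map with attributes instead of XML, as we did.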

I hope this post hasn’t turned out too long. By upgrading NHibernate we solved the mystery of performance degrading over time: the more the user worked with persisted binary data in the application, the longer the loop comparing each collection byte by byte took.

Upgrading NHibernate introduced a performance challenge in another area: application start-up began taking longer. That will go in Part II, as I should get some sleep.

Sweet dreams!



Friday, 16 August 2013

Best Buy Adding Microsoft Mini Stores at 600 Locations

Microsoft has teamed up with Best Buy to bring the Microsoft Store experience to the big box retailer.

Starting later this month, Microsoft "stores within a store" will start popping up at 500 Best Buy locations across the U.S., as well as 100 Best Buys and Future Shops in Canada.

The stores will be 1,500 to 2,200 square feet in size and will let shoppers try out Windows tablets and PCs, Windows Phones, Microsoft Office, Xbox, and more. Microsoft products will be still sold in other parts of Best Buy, though.

"We will have a great Windows tablet table inside the Best Buy Tablet Department, Windows Phones in their Phone Department, and lots of space for Xbox in their Gaming and Home Theater Departments," Chris Capossela, chief marketing officer at Microsoft, said in a blog post.

Best Buy has also added a Windows-centric section to its website.

Microsoft opted for Best Buy because it is the No. 1 retailer of PCs in the world, Capossela said. It will also be a good opportunity for shoppers to get their hands on Microsoft products - like the Surface tablet and new Windows 8-based machines - if they don't live near one of the 68 Microsoft Stores in North America.

During a February appearance at the Goldman Sachs Technology and Internet Conference, then-CFO Peter Klein said that hands-on time is key in getting people to pick up a new Windows gadget. "People really need to touch and see and play with it," Klein said of the Surface. As a result, Microsoft offered the Surface at more retailers with the Surface Pro launch.

"Retail is a priority, and this partnership with Best Buy is a prime example of our commitment to the customer experience in evaluating, experiencing and enjoying Microsoft devices, and the software and services that connect them," Capossela said.

Meanwhile, the king of tablets, Apple, has a huge retail presence all over the globe.

Best Buy has also provided the same kind of store-within-a-store boutique experience to Apple - and its biggest rival Samsung. In April, Samsung execs celebrated the grand opening of the Samsung Experience Shop at Best Buy's Union Square store. Samsung will open a chain of mini shops within Best Buy big box and Best Buy Mobile locations around the U.S., for a total of 1,400 by early June.





Thursday, 15 August 2013

Who Are the Real Winners in the Microsoft-Oracle Deal?

Microsoft and Oracle announced a wide-ranging partnership deal in late June. At first glance, the logic behind it was fairly baffling.

The agreement lets Oracle customers run its software, including Java, Oracle Database and WebLogic Server, on Microsoft's Hyper-V hypervisor, in its Azure cloud or on Windows Server. The Oracle software will be certified to run on Microsoft platforms, meaning both companies will work to fix any problems that might arise.

Microsoft, for its part, will offer a fully licensed version of Java, as well as development tools and Oracle Linux in Azure.

The deal was difficult to understand because the two companies compete in the enterprise database marketplace. Why would Microsoft welcome Oracle's database into its Azure cloud to compete with its own SQL Azure? And why would Oracle want to undermine its own Oracle Cloud by encouraging Microsoft to offer Oracle software on Azure?

Oracle Needs to Put Its Head in Public Cloud
To get to grips with what's going on, it's important to understand that Oracle Cloud is weak and getting weaker by the month, says James Staten, a principal analyst at Forrester Research. "The company's public cloud offering has really had very little success, so Oracle really needs to find another way to get its software into the cloud."

Amazon already resells Oracle technologies on its Amazon Web Services cloud platform, so it makes sense to look at Azure as well, Staten says.

"Oracle can't just stop at Amazon. Azure is the No. 2 public cloud and it is growing, so it is absolutely in Oracle's interest to get its technology onto it," he points out, adding that Oracle can still say that its technologies run best on Oracle Cloud to preserve its dignity.

Related: Ellison: Oracle Will Deliver World's 'Most Comprehensive Cloud'
Holger Mueller, principal analyst at Constellation Research, agrees that Oracle's motivation for the deal is to make its technology available beyond its own public cloud offering. "Oracle is getting its database onto the public cloud provider of the future," he says. "This is a major coup, as now Google is the only major public cloud that doesn't support Oracle."

Microsoft Needs Java (Hold the Cream and Sugar)
What does Microsoft get out of the deal? Mueller believes Redmond has been very keen to get access to Java. "Amazon's AWS cloud gets developer support because it offers Java, and Microsoft is desperate to do the same," he says. "It wants to become the No. 1 IaaS vendor, but if it has no support for Java, then that becomes almost impossible."

In the past, Microsoft has supported Java on its Azure platform, but customers have had to "bring their own" Java-and their own license for it. To be able to have Java set up and ready to go on Azure, along with all the necessary tools, Microsoft needed its own Java license.

When Sun owned Java, Staten says, it didn't want to license it to Microsoft because it didn't want to make Windows stronger. By putting Oracle's database on Azure, Microsoft has found a way to get a Java license from Java's new owner.

Where does that leave Microsoft's SQL database? Staten points out that, since Azure is positioned as Windows in the cloud, any software that runs on premises should also be able to run on Azure. "This deal means that no one is forced to give up Oracle if they want to run their applications in the Azure cloud," he says.

Related: Microsoft Bulks Up Azure, Makes it Easier to Build, Expand Cloud Services
In any case, SQL isn't seen as a direct alternative to the more powerful and scalable Oracle database by most organizations, Staten says. "But for Microsoft, at least Oracle customers may be exposed to SQL Azure or SQL running in a VM as a consequence of this deal."

The partnership is also good for Microsoft because it's exclusive: You can virtualize Oracle software only in an Oracle VM or using Microsoft's virtualization technology. There's no technical reason why Oracle can't run on a VMware hypervisor, but that company has effectively been left out in the cold. "Oracle appears to be betting against VMware becoming a leader in the public cloud space," Staten says.

Everyone's a Winner
As for the known details of the deal, Microsoft will likely have to pay Oracle a significant sum for the Java license. In financial terms, then, Oracle would seem to be the main beneficiary of the deal in the short term. Oracle also benefits by ensuring that its technologies can be used in what one day may be the biggest public cloud.

Looking further, though, Microsoft may be the bigger winner. That's because Azure and Hyper-V gain a great deal of credibility from being certified platforms for Java and Oracle's database. It puts Microsoft Azure on nearly equal footing with AWS and definitely gives it an advantage over VMware's public cloud efforts.

But perhaps the biggest beneficiaries will be Microsoft's and Oracle's enterprise customers. The deal makes Azure a more competitive platform for them, as Oracle software is now fully supported in Microsoft's cloud; previously, customers would have had to go to AWS for that kind of support. Plus, increased competition is invariably good for business.





Saturday, 3 August 2013

Breaking down Google's impact on the Moto X

As the first device designed after Google's acquisition of Motorola, the Moto X is a good combination of both companies' services.

Moto X is the first completely new smartphone project that was launched after Google acquired Motorola Mobility. As such, it fully integrates the technology assets of both companies. It is a carefully designed, customizable mass-market consumer device with much embedded Google technology: speech recognition, contextual awareness, and personalized search. It’s available in 18 colors with 7 accent colors. The specifications are adequate for a high-end smartphone and meet or exceed most of the iPhone 5 specifications.

At the announcement in New York yesterday, Motorola Senior VP of Product Management Rick Osterloh introduced the Moto X with a personal demonstration. Rather than one big Apple or Samsung-like announcement with hundreds of people, Motorola held four personalized sessions for approximately 50 journalists at a time, allowing interactive questions.

Osterloh led with “Touchless Control.” Motorola adapted Google Now to utilize a proprietary always-on speech recognition function. It’s based on the Motorola X8 Computing System, which combines a standard Qualcomm Snapdragon S4 Pro dual-core CPU and quad-core GPU with two proprietary cores, one for natural language and the other for contextual computing.

The Moto X uses the natural language processor to monitor local sound sources at low power for the phrase “OK Google Now,” which, when detected, takes the smartphone out of its low-power state and hands the speech stream to Google Now for recognition and a response through Google services, such as search and navigation. Osterloh said the Moto X is not listening to every word; it is listening only for the signature of “OK Google Now” to awaken the smartphone. If Google Now’s speech recognition were constantly monitoring for this cue using ordinary hardware, the battery would quickly drain.

The user can train the Moto X to recognize his or her voice. It’s not completely foolproof, as someone with a similar voice can prompt the Moto X to awaken. This was shown when an attendee at the event shouted "OK Google Now" and briefly took control of the device. The user can choose to add a password or PIN code to protect the device from unauthorized access, and a Bluetooth device, such as an in-car hands-free system, can be configured as a trusted command device, eliminating the need for password or code entry. Touchless Control was demonstrated to work at cocktail-party levels of ambient noise, and at a distance of up to eight or 10 feet.

Motorola’s researchers learned that the average person activates his or her smartphone 60 times a day, to check the time or respond to notifications. The Moto X uses the contextual processor to operate its “Active Display” to present time of day, missed calls, and notifications at low power without taking the smartphone out of sleep mode. Only a minimum number of pixels are illuminated, saving power by leaving the rest of the OLED display dark. The contextual processor recognizes if the smartphone is face down or in a pocket and does not illuminate the Active Display.

The 10-megapixel camera has three improvements. A twist of the wrist launches the camera without entering a password or PIN. The UI is simplified, moving most camera controls to a panel that can be exposed with a left-to-right swipe. This UI makes it possible to take a photo by touching any part of the screen, replacing the small blue icon that requires concentrated fine motor control to press. The camera is easier to focus and produces better images with an RGBC camera sensor that captures up to 75% more light when the picture is taken.

Most interesting is the user customization. The image at the beginning of this report gives one a sense of the many choices the consumer has to personalize the Moto X with a color scheme. The consumer can choose from two bezel colors, 18 back-plate covers, and seven accent colors, for a total of 252 unique combinations. The user can also add personalized text to the back of the Moto X, such as a name or email address that a good Samaritan might use to contact the owner if the smartphone is lost.

Motorola has created a web service called “Moto Maker” for consumers to use in visually sampling and choosing colors, accent colors and personalized text inscriptions. The suggested price is $199 with a carrier contract. Those interested in buying one can visit a carrier and purchase the Moto X at the contract price, where they will be given a voucher with a PIN to enter into the Moto Maker web service to order the Moto X. Motorola said it has organized its supply chain to assemble the Moto X in Fort Worth, Texas, with a four-day turnaround from order to shipment. Consumers can also use Moto Maker to purchase directly from Motorola online.

Recognizing speech, understanding the meaning of speech and executing specific commands are priorities for Google. To this point, Google recently hired artificial intelligence expert Ray Kurzweil to lead engineering advances in speech technologies. Motorola may be pushing present-day speech technology to its limits. Moto X’s Touchless Control appears to have made at least an incremental improvement over Google Now and Apple Siri. Even if the incremental improvement in speech is not large, the combination of Touchless Control, Active Display, colorful customizability, and buying experience will drive consumer adoption. Google takes risks and innovates at a scale of many millions and billions. Whether the Moto X achieves Google scale remains to be seen.
