SOFTWARE

 

Research: 53% deploying or evaluating microservers

 Microservers are replacing more traditional servers at some organizations. Tech Pro Research reveals who is using them, who is not, and the reasons driving these decisions.


Servers remain an integral part of IT services, and microservers are an option for many organizations, with their reduced power usage, smaller space needs and lower costs.

To find out who is using microservers, and why, and how this form factor might help organizations, Tech Pro Research conducted an online survey in February and March 2014. There were 167 respondents, from companies of all sizes around the world. The resulting report, Microservers: The newest data center innovation, revealed that many companies are making room for microservers in their IT budgets.

Server innovation

There are several different types of servers, from traditional rack-mount servers, to converged systems, hybrid systems and now, the microserver. Given that the microserver is a relatively new form factor, it's hard for some organizations to decide where to use them, if at all. Companies of all sizes are either testing, evaluating, or are already deploying microservers. They fill a void for certain situations by being at the right price point. 
Many consider microservers to be a significant innovation in the data center. When comparing responses from companies of all sizes, 69% of respondents said they are a significant innovation. Breaking it down by company size revealed that the smaller companies, with 249 or fewer employees, were even more impressed with microservers, with 75% of those in organizations with 50-249 employees considering them significant, and 74% of those in companies with fewer than 50 employees.

Overall, 53% of respondents said they are working on microserver projects, either having already deployed them or evaluating them before possibly purchasing.

It's interesting to note that microservers are being evaluated more often than they are being deployed or tested. That could suggest microservers won't make it into the next generation of mainstream hardware, though it is far from a solid indication.

Reasons for not using microservers

Respondents not planning to use microservers in their IT departments most often cited a preference for traditional platforms, along with I/O limitations. Concerns about the supportability, security and trustworthiness of the microserver platform accounted for 36% of responses from those who voiced anti-microserver views, as seen in the following chart:
Other topics covered in the research report include:

  • Physical locations of microservers
  • Why companies are using microservers
  • Preferred vendors
  • Network and connectivity options with microservers
  • Total number of servers in use


Microservers: The newest data center innovation

  

The microserver has become an option for companies' computing needs in certain use cases. Chosen wisely, this new arrival on the server form factor spectrum can solve an immediate need. Reduced cost and lower space and power requirements are among the factors that entice companies toward this form factor for server-class systems.

Tech Pro Research conducted an online survey of 167 individuals to find out who is using microservers, and why. To find out the results of the survey, download the full report, Microservers: The newest data center innovation.

 

Resource and Data Recovery Policy

  

Employees, data and resources are three of the biggest assets in any given organization. With concepts such as mobility and cloud access becoming nearly ubiquitous, the ability of company staff to protect and recover data and/or resources (“resources” being defined herein as “a company data repository such as a wiki or help desk knowledgebase”) spread out across disparate environments is especially vital for continued success.

Even if company information is kept solely in-house, all employees should be familiar with the processes for recovering information if it becomes lost, inaccessible or compromised. Companies that do use cloud services should be aware of proper guidelines to follow if an agreement with an outside vendor providing access to data is being canceled.

Download Tech Pro Research's Resource and Data Recovery Policy to use as a template or a standalone policy to provide guidelines for the recovery of data from company-owned or company purchased resources, equipment and/or services.

 

 

100+ IT policies at your fingertips, ready for download

From BYOD and social media to ergonomics and encryption, Tech Pro Research has dozens of ready-made, downloadable IT policy templates.
Crafting an effective IT policy can be a daunting and expensive task.
You could spend hours writing it yourself, but consider how much your time is worth. The average salary of an IT Director/Manager in the U.S. is about $100,000 (depending on geographic location, company, education, etc.). Over a year, that salary breaks down to about $48 per hour. If it takes you one work day to write an IT policy, that single policy costs you $384 ($48 x 8 hours).
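The back-of-the-envelope math above is easy to verify; a minimal sketch, assuming the article's $100,000 salary figure and a standard 2,080-hour work year (52 weeks at 40 hours):

```python
# Estimate what writing a single IT policy costs in staff time,
# using the article's assumed figures.
ANNUAL_SALARY = 100_000        # average IT Director/Manager salary (USD)
WORK_HOURS_PER_YEAR = 52 * 40  # 2,080 hours

hourly_rate = ANNUAL_SALARY / WORK_HOURS_PER_YEAR  # ~$48/hour
policy_cost = round(hourly_rate) * 8               # one 8-hour work day

print(f"Hourly rate: ${hourly_rate:.2f}")  # Hourly rate: $48.08
print(f"Policy cost: ${policy_cost}")      # Policy cost: $384
```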
Don't have time to write a policy? You can pay a consultant hundreds of dollars to create one for you or buy a single policy online for $25, $50, or more. You may even be able to find a handful of policies bundled together, but these packages can cost over $500.


Coverity finds open source software quality better than proprietary code

 The irony isn't lost on me: in the aftermath of the open-source OpenSSL Heartbleed fiasco, Coverity, a company specializing in software quality and security testing solutions, has found that open source software has fewer defects in its code than proprietary programs. Nevertheless, the numbers don't lie: the 2013 Coverity Scan Open Source Report (PDF link) found that open source had fewer errors per thousand lines of code (KLoC) than proprietary software.

Coverity found that open-source programs tend to have fewer errors per thousand lines of code than their proprietary counterparts.

The Coverity Scan service, which the study was based on, was started with the US Department of Homeland Security in 2006. The project was designed to give hard answers to questions about open source software quality and security.
For this latest Coverity Scan Report, the company analyzed code from more than 750 open source C/C++ projects as well as an anonymous sample of enterprise projects. In addition, the report highlights analysis results from several popular, open source Java projects that have joined the Scan service since March 2013. Specifically, the company scanned the code of C/C++ programs, such as NetBSD, FreeBSD, LibreOffice, and Linux, and Java projects such as Apache Hadoop, HBase, and Cassandra.
The 2013 report's key findings included:
  • Open source code quality surpasses proprietary code quality in C/C++ projects. Defect density (defects per 1,000 lines of software code) is a commonly used measurement for software quality, and a defect density of 1.0 is considered the accepted industry standard for good quality software. Coverity’s analysis found an average defect density of .59 for open source C/C++ projects that leverage the Scan service, compared to an average defect density of .72 for proprietary C/C++ code developed for enterprise projects. In 2013, code quality of open-source projects using the Scan service surpassed that of proprietary projects at all code base sizes, which further highlights the open source community’s strong commitment to development testing.
  • Linux continues to be a benchmark for open source quality. By leveraging the Scan service, Linux has reduced the average time to fix a newly detected defect from 122 days to just six. Since the original Coverity Scan Report in 2008, scanned versions of Linux have consistently achieved a defect density of less than 1.0. In 2013, Coverity scanned more than 8.5 million lines of Linux code and found a defect density of .61.
  • C/C++ developers fixed more high-impact defects than Java developers. The Coverity analysis found that developers contributing to open source Java projects are not fixing as many high-impact defects as developers contributing to open source C/C++ projects. Java project developers participating in the Scan service only fixed 13 percent of the identified resource leaks, whereas participating C/C++ developers fixed 46 percent. This could be caused in part by a false sense of security within the Java programming community, due to protections built into the language, such as garbage collection. However, garbage collection can be unpredictable and cannot address system resources so these projects are at risk.
  • Apache HBase serves as benchmark for Java projects. Coverity analyzed more than eight million lines of code from 100 open source Java projects, including popular big data projects Apache Hadoop 2.3 (320,000 lines of code), HBase (487,000 lines of code), and Apache Cassandra (345,000 lines of code). Since joining the Scan service in August 2013, Apache HBase — which is Hadoop’s database — fixed more than 220 defects, including a much higher percentage of resource leaks compared to other Java projects in the Scan service (i.e., 66 percent for HBase compared to 13 percent on average for other projects).
Zack Samocha, senior director of products for Coverity, said in a statement, "Our objective with the Coverity Scan service is to help the open source community create high-quality software. Based on the results of this report — as well as the increasing popularity of the service — open source software projects that leverage development testing continue to increase the quality of their software, such that they have raised the bar for the entire industry."
Coverity also announced that it has opened up access to the Coverity Scan service, allowing anyone interested in open source software to view the progress of participating projects. Individuals can now become Project Observers, which enables them to track the state of relevant open source projects in the Scan service and view high-level data including the count of outstanding defects, fixed defects, and defect density.
"We’ve seen an exponential increase in the number of people who have asked to join the Coverity Scan service, simply to monitor the defects being found and fixed. In many cases, these people work for large enterprise organizations that utilize open source software within their commercial projects," concluded Samocha. "By opening up the Scan service to these individuals, we are now enabling a new level of visibility into the code quality of the open-source projects, which they are including in their software supply chain."



When Apple's New iPhone Software Arrives Next Week, It Will Crash A Lot Of Apps, Software Tester Says





Don't be in a hurry to upgrade to the new iPhone software, iOS 7, when it's released to the public on Wednesday unless you like buggy apps that crash all the time, an app testing company is warning.
An enormous number of iPhone app makers are behind the eight ball in getting their apps to work properly with iOS 7, says Matt Johnston, chief marketing officer of uTest, a Massachusetts-based startup that does crowdsourced testing of mobile apps for companies like Google, Amazon, HBO, and USA Today.
Among uTest's customers, "90% of iOS apps tested for first time are having trouble," Johnston said. Apps are having performance problems in twice as many areas as they typically do and it's taking developers three or four tries to fix all the things that iOS 7 breaks.
And by trouble he means big stuff, like crashes, or mis-sized fonts that cut off text.
iOS 6 apps that use the swipe-up gesture could have big problems. That gesture won't be available to apps in iOS 7, because Apple has commandeered swipe-up to bring up its new Control Center.
Apple warned that the change from iOS 6 to iOS 7 is the biggest change in years, but a lot of app developers weren't listening.
"I just talked to one large retail player and one large travel player and asked about iOS 7 upgrade plans and they said they didn't think it will that big a deal," he told us.
This is different, and far worse, than the usual iOS upgrade.
"When we moved from iOS 4 to 5 to 6, we weren't seeing this kind of spike in both performance issues and UI rendering issues," he says.
The app problems won't last long, maybe a few weeks. So if you can hold off from upgrading that long you might be thankful. That's Johnston's personal plan.
"I will not be the first one to upgrade iOS 7," he laughs.



Today I want to tell you a little about the software, or rather the online resources, that I use when optimizing my blog. You probably already know about some of them and have used them yourself. So, let's begin.

1. be1.ru/stat/ — this resource is very simple and clear. You can use it to analyze a site's indexation and parameters such as TIC and PR, and to check the robots.txt file. There are many similar services on the web, so nothing really distinguishes it from the others, but nevertheless...

2. page-weight.ru — this service, or more precisely the program developed by it, lets you analyze a site's internal parameters, such as outgoing links, headings and broken links. Its biggest plus is that it helps you optimize a site correctly by analyzing the internal "weight" of its pages with a special PR calculation formula.

3. home.snafu.de/tilman/xenulink.html — this program lets you check your site for broken links, analyze page titles and generate a sitemap. It works very quickly and, on top of that, is completely free.

4. ahrefs.com — this service is in English, but it's not hard to figure out. It's intended for viewing your anchor list, and it also lets you track the dynamics of your link purchases.

5. solomono.ru — this service, well known to all of us, is intended for analyzing a site's outgoing links. Its main plus is that solomono also lets you see your resource's backlinks.

6. promolab.ru/free/ — this service is intended for analyzing keyword density on a page. Using it, you can easily polish an article to 100% relevance for the desired keyword while keeping an eye on how over-spammed the pages are.
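Keyword density of the kind such services report is straightforward to compute yourself. A minimal sketch — the whitespace-and-punctuation tokenization here is a simplification; real tools also handle stemming and word forms:

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Return the keyword's share of all words in the text, as a percentage."""
    words = re.findall(r"\w+", text.lower())
    if not words:
        return 0.0
    hits = words.count(keyword.lower())
    return 100.0 * hits / len(words)

sample = "SEO tools help with SEO audits and SEO reports for any site"
print(round(keyword_density(sample, "seo"), 1))  # 25.0 (3 of 12 words)
```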
