
Published By: CyrusOne     Published Date: Jul 02, 2016
Even through challenging economic times, the need for physical data center capacity continues to grow. For some businesses, the driver is expansion into new markets or geographies. For others, it's the need to deal with growing amounts of data generated by applications with high-capacity demands, evolving end-user abilities, or regulatory bodies that demand ever-increasing quantities of meticulous documentation. The "build-or-buy" decision between construction and colocation should be weighed carefully, as the choice will affect your company and your bottom line, quite literally, for decades. This executive report reviews six key factors that affect that choice, some of which extend beyond a basic TCO analysis. (A minimal sketch of the underlying TCO arithmetic follows this entry.)
Tags : 
    
CyrusOne
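The sketch below is a minimal, hedged illustration of the build-versus-colocate TCO arithmetic the report refers to, written in Python. Every figure (capital outlay, operating cost, lease rate, discount rate, horizon) is a hypothetical placeholder, not data from the report; the report's six qualitative factors sit on top of this kind of calculation.

```python
# Build-vs-colocate TCO sketch. All figures are hypothetical placeholders;
# substitute your own construction quotes and lease proposals.

def build_tco(capex, annual_opex, years, discount_rate=0.08):
    """Present value of building: upfront capital plus discounted annual opex."""
    pv_opex = sum(annual_opex / (1 + discount_rate) ** y for y in range(1, years + 1))
    return capex + pv_opex

def colo_tco(annual_lease, years, discount_rate=0.08):
    """Present value of a colocation lease over the same horizon."""
    return sum(annual_lease / (1 + discount_rate) ** y for y in range(1, years + 1))

if __name__ == "__main__":
    horizon = 10  # years; data center decisions often span far longer
    build = build_tco(capex=12_000_000, annual_opex=900_000, years=horizon)
    colo = colo_tco(annual_lease=1_800_000, years=horizon)
    print(f"Build PV over {horizon} yrs: ${build:,.0f}")
    print(f"Colo PV over {horizon} yrs:  ${colo:,.0f}")
```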
Published By: CyrusOne     Published Date: Jul 06, 2016
Data centers help state and federal agencies reduce costs and improve operations. Every day, government agencies struggle to meet strict cost controls and lower operational expenses while fulfilling the goal of the Federal Data Center Consolidation Initiative (FDCCI). All too often they find themselves constrained by legacy in-house data centers and connectivity solutions that fail to deliver the data center reliability and uptime they require.
Tags : 
    
CyrusOne
Published By: CyrusOne     Published Date: Jul 06, 2016
Many companies, especially those in the oil and gas industry, need high-density deployments of high-performance computing (HPC) environments to handle the extreme processing and analysis demands of seismic data. CyrusOne's Houston West campus has the largest known concentration of HPC and high-density data center space in the colocation market today.
Tags : 
    
CyrusOne
Published By: CyrusOne     Published Date: Jul 06, 2016
CyrusOne's quick-delivery data center product provides a solution for cloud technology, social media and enterprise companies that have trouble building or obtaining data center capacity fast enough to support their information technology (IT) infrastructure. In trying to keep pace with overwhelming business growth, these companies often find it hard to predict their future capacity needs. A delay in obtaining data center space can also delay or stop a company's revenue-generating initiatives and have a significant negative impact on the bottom line.
Tags : 
    
CyrusOne
Published By: QTS Realty Trust, Inc.     Published Date: Jul 19, 2016
The Windy City is a major hub for Internet and financial infrastructure, with active communities of data center users and service providers. Chicago is America's third-largest city and an active business market, with nearly 40 Fortune 500 companies headquartered in the metro area. Chicago is distinctive in that it sees demand for data center space from a wide range of industries. It is home to major trading exchanges for stocks, commodities and options, making the city a hotbed of activity for the financial services industry. The region has also become a favored location for hosting, colocation and cloud computing companies. Chicago sees strong demand from the enterprise sector as well, both for primary data centers and for backup/disaster recovery facilities.
Tags : 
    
QTS Realty Trust, Inc.
Published By: Legrand     Published Date: Aug 09, 2016
Efficiency is a key objective when designing a data center, and efficiency efforts typically focus almost entirely on power and cooling. Yet efficiencies can be realized in many other areas, resulting in additional cost savings, reliable network performance, easier maintenance, flexibility, and scalability. The success and efficiency of the data center can be maximized by considering five key elements during design: performance, time, space, experience, and sustainability.
Tags : 
    
Legrand
Published By: Legrand     Published Date: Aug 09, 2016
The need for the "Connected Infrastructure of Tomorrow" is approaching faster than you think. With the Internet of Things (IoT), there are new and critical considerations to weigh as one prepares for what's ahead. Are you ready for it? To start planning and preparing now for tomorrow's network, we have to understand today's requirements for power, light and data. We have to be smart about the future. What will the world we live, play and work in be like 10-15 years from now? One can only guess, but there are some important things to consider today, when we design the infrastructure to support tomorrow's needs driven by billions of IoT-connected devices.
Tags : 
    
Legrand
Published By: Intel     Published Date: Dec 31, 2015
A recent survey of 200 data center managers across the US and UK reveals that a large proportion of data centers take a manual approach to planning and forecasting. Despite its limitations, MS Excel emerges as a popular tool, and nearly one in ten managers resort to walking around a data center with a tape measure. Only just over half benefit from Data Center Infrastructure Management (DCIM) tools. The manual approach is by no means limited to smaller data centers; the proportion was found to remain the same even among larger data centers (those with more than 1,500 servers). When asked why manual methods were employed, 46% said it was because they felt the alternatives would be too expensive. A further 35% feared they lacked the resources to implement a more automated approach. While both these factors may seem reasonable enough at first sight, both might actually represent false economies in the longer run.
Tags : 
    
Intel
Published By: Intel     Published Date: May 06, 2016
Energy costs are the fastest-rising expense for today's data centers. Naturally, power consumption is a top concern for managers of data center, enterprise, and cloud environments. Moreover, various solution providers support multiple proprietary power measurement and control protocols, making it challenging to manage power across all devices in the data center with a single solution. After a successful Intel IT proof of concept (PoC) using Intel® Data Center Manager (Intel® DCM) that ended in 2013, we deployed the solution across data centers in multiple countries. In our initial use of Intel DCM, we considered the solution to be focused primarily on gaining a better understanding of the power consumption and thermal status of servers. With broad deployment, we learned that Intel DCM is capable of much more. (A hedged sketch of reading per-server power and thermal data appears after this entry.)
Tags : 
    
Intel
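Intel DCM exposes this telemetry through its own console and SDK; the fragment below is not that interface, only a hedged illustration of the kind of per-server power and thermal reading involved, using the vendor-neutral DMTF Redfish schema. The BMC address, chassis path, and credentials are placeholder assumptions.

```python
# Hedged sketch: poll a server BMC for power draw and inlet temperature
# via the DMTF Redfish REST API. Host, chassis ID, and credentials are
# placeholders; real deployments should also verify TLS certificates.
import requests

BMC = "https://10.0.0.50"          # hypothetical BMC address
AUTH = ("admin", "password")       # placeholder credentials
CHASSIS = "/redfish/v1/Chassis/1"  # chassis ID varies by vendor

def get(path):
    r = requests.get(BMC + path, auth=AUTH, verify=False, timeout=10)
    r.raise_for_status()
    return r.json()

power = get(CHASSIS + "/Power")
watts = power["PowerControl"][0].get("PowerConsumedWatts")

thermal = get(CHASSIS + "/Thermal")
inlet = next((t.get("ReadingCelsius") for t in thermal.get("Temperatures", [])
              if "Inlet" in (t.get("Name") or "")), None)

print(f"Power draw: {watts} W, inlet temperature: {inlet} °C")
```

Polled across a fleet, readings like these are what a DCIM-class tool aggregates into rack- and room-level power and thermal views.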
Published By: Emerson Network Power     Published Date: Jan 21, 2016
Ponemon Institute and Emerson Network Power are pleased to present the results of the latest Cost of Data Center Outages study. Previously published in 2010 and 2013, the purpose of this third study is to continue to analyze the cost behavior of unplanned data center outages. According to our new study, the average cost of a data center outage has steadily increased from $505,502 in 2010 to $740,357 today (or a 38 percent net change).
Tags : 
    
Emerson Network Power
Published By: Emerson Network Power     Published Date: Mar 18, 2016
In this report, we’ll overview several alternative power configurations that improve overall cost and deployment speed while providing the availability levels required for this new generation.
Tags : 
    
Emerson Network Power
Published By: Emerson Network Power     Published Date: Mar 18, 2016
Ponemon Institute and Emerson Network Power are pleased to present the results of the first Emerson Data Center IQ Quiz, part of the Data Center Performance Benchmark Series, which provides an industry-wide perspective on Availability, Security, Productivity, Cost and Speed of Deployment. The purpose of this study is to determine the domain knowledge of data center personnel while also collecting data on application of best practices and current operating conditions within participants’ data centers.
Tags : 
    
Emerson Network Power
Published By: Spectrum Enterprise     Published Date: Feb 07, 2018
How Fiber Powers Growth – An Expert Q&A Guide provided by Spectrum Enterprise. Businesses today need bandwidth capacity to handle complex applications and ever-increasing data. See how technology experts rely on fiber to increase productivity and provide stronger growth opportunities.
Tags : 
    
Spectrum Enterprise
Published By: Cisco EMEA Tier 3 ABM     Published Date: Mar 05, 2018
The operation of your organization depends, at least in part, on its data. You can avoid fines and remediation costs, protect your organization’s reputation and employee morale, and maintain business continuity by building a capability to detect and respond to incidents effectively. The simplicity of the incident response process can be misleading. We recommend tabletop exercises as an important step in pressure-testing your program.
Tags : 
human resources, cisco, employees, data, analysis
    
Cisco EMEA Tier 3 ABM
Published By: Hewlett Packard Enterprise     Published Date: Feb 05, 2018
As businesses plunge into the digital future, no asset will have a greater impact on success than data. The ability to collect, harness, analyze, protect, and manage data will determine which businesses disrupt their industries, and which are disrupted; which businesses thrive, and which disappear. But traditional storage solutions are not designed to optimally handle such a critical business asset. Instead, businesses need to adopt an all-flash data center. In their new role as strategic business enablers, IT leaders have the responsibility to ensure that their businesses are protected by investing in flexible, future-proof flash storage solutions. The right flash solution can deliver on critical business needs for agility, rapid growth, speed-to-market, data protection, application performance, and cost-effectiveness, while minimizing the maintenance and administration burden.
Tags : 
data, storage, decision makers, hpe
    
Hewlett Packard Enterprise
Published By: Hewlett Packard Enterprise     Published Date: Mar 26, 2018
Over the past several years, the IT industry has seen solid-state (or flash) technology evolve at a record pace. Early on, the high cost and relative newness of flash meant that it was mainly relegated to accelerating niche workloads. More recently, however, flash storage has "gone mainstream" thanks to maturing media technology. Lower media cost has resulted from memory innovations that have enabled greater density and new architectures such as 3D NAND. Simultaneously, flash vendors have refined how to exploit flash storage's idiosyncrasies; for example, they can extend flash media lifespan through data reduction and other techniques. (A rough endurance calculation illustrating this point follows this entry.)
Tags : 
    
Hewlett Packard Enterprise
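As a rough illustration of why data reduction stretches flash lifespan, the arithmetic below estimates drive endurance from a rated drive-writes-per-day (DWPD) figure and a host write rate, then scales it by a reduction ratio. All inputs are hypothetical and write amplification is ignored, so treat it as a back-of-the-envelope sketch only.

```python
# Hypothetical endurance estimate: data reduction lowers the physical bytes
# written per host byte, stretching the drive's rated endurance budget.
# Ignores write amplification and over-provisioning for simplicity.

def lifetime_years(capacity_tb, dwpd, warranty_years,
                   host_writes_tb_per_day, reduction_ratio):
    rated_tbw = capacity_tb * dwpd * warranty_years * 365        # rated total writes (TB)
    physical_per_day = host_writes_tb_per_day / reduction_ratio  # writes after reduction
    return rated_tbw / physical_per_day / 365

# Placeholder figures: 7.68 TB drive rated 1 DWPD for 5 years, 4 TB/day of host writes.
for ratio in (1.0, 2.0, 3.0):
    years = lifetime_years(7.68, 1, 5, 4.0, ratio)
    print(f"reduction {ratio:.0f}:1 -> ~{years:.1f} years of endurance")
```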
Published By: Hewlett Packard Enterprise     Published Date: Mar 26, 2018
Today's data centers are expected to deploy, manage, and report on different tiers of business applications, databases, virtual workloads, home directories, and file sharing simultaneously. They also need to co-locate multiple systems while sharing power and energy. This is true for large as well as small environments. The trend in modern IT is to consolidate as much as possible to minimize cost and maximize efficiency of data centers and branch offices. HPE 3PAR StoreServ is highly efficient, flash-optimized storage engineered for the true convergence of block, file, and object access to help consolidate diverse workloads efficiently. HPE 3PAR OS and converged controllers incorporate multiprotocol support into the heart of the system architecture.
Tags : 
    
Hewlett Packard Enterprise
Published By: Hewlett Packard Enterprise     Published Date: Mar 26, 2018
Modern storage arrays can't compete on price without a range of data reduction technologies that help reduce the overall total cost of ownership of external storage. Unfortunately, no single data reduction technology fits all data types; we see savings from both data deduplication and compression, depending on the workload. Typically, OLTP-type data (databases) works well with compression and can achieve between 2:1 and 3:1 reduction, depending on the data itself. Deduplication works well with large volumes of repeated data, such as virtual machines or virtual desktops, where many instances or images are based on a similar "gold" master. (A short worked example of blending these ratios follows this entry.)
Tags : 
    
Hewlett Packard Enterprise
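To make those ratio figures concrete, here is a small worked example that blends per-workload reduction ratios into an effective capacity number. The workload mix, sizes, and the dedupe ratio for virtual desktops are invented for illustration; only the 2:1 to 3:1 compression range for OLTP data comes from the text above.

```python
# Blend per-workload data reduction ratios into an effective capacity figure.
# Workload sizes and ratios are illustrative assumptions, not measured values.
workloads = {
    # name: (logical TB, assumed reduction ratio)
    "OLTP databases (compression)": (40.0, 2.5),  # 2:1 - 3:1 typical per the text
    "Virtual desktops (dedupe)":    (60.0, 5.0),  # hypothetical dedupe ratio
    "General file shares":          (20.0, 1.5),
}

logical_tb = sum(size for size, _ in workloads.values())
physical_tb = sum(size / ratio for size, ratio in workloads.values())

print(f"Logical data:   {logical_tb:.0f} TB")
print(f"Physical used:  {physical_tb:.1f} TB")
print(f"Blended ratio:  {logical_tb / physical_tb:.2f}:1")
```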
Published By: Hewlett Packard Enterprise     Published Date: Mar 26, 2018
Within the next 12 months, solid-state arrays will improve in performance by a factor of 10, and double in density and cost-effectiveness, therefore changing the dynamics of the storage market. This Magic Quadrant will help IT leaders better understand SSA vendors' positioning in the market.
Tags : 
    
Hewlett Packard Enterprise
Published By: Hewlett Packard Enterprise     Published Date: Mar 26, 2018
Business users expect immediate access to data, all the time and without interruption. But reality does not always meet expectations. IT leaders must constantly perform intricate forensic work to unravel the maze of issues that impact data delivery to applications. This performance gap between the data and the application creates a bottleneck that impacts productivity and ultimately damages a business’ ability to operate effectively. We term this the “app-data gap.”
Tags : 
    
Hewlett Packard Enterprise
Published By: Oracle     Published Date: Jan 08, 2018
Data is a driver of growth and change that is quickly becoming the world's most valuable resource. As such, finance leaders face increased pressure to value data as an asset on their balance sheets and use it to drive business strategy. To capitalize on the power of data, learn how to pinpoint where you are today, what steps you can take to reach your goals, and how to measure success.
Tags : 
    
Oracle
Published By: Aberdeen     Published Date: Jun 17, 2011
Download this paper to learn the top strategies leading executives are using to take full advantage of the insight they receive from their business intelligence (BI) systems - and turn that insight into a competitive weapon.
Tags : 
aberdeen, michael lock, data-driven decisions, business intelligence, public sector, analytics, federal, state, governmental, decisions, data management
    
Aberdeen
Published By: Oracle     Published Date: Feb 28, 2018
IT departments have used IaaS to free staff from repetitive tasks such as hardware maintenance and software updates, but the potential benefits of adoption have not always been fully realized. Nearly a quarter of companies (22%) say that if they had to repeat their IaaS implementation, they would use automated migration tools. Oracle Ravello, for example, lets companies move workloads automatically from existing data centers to cloud platforms without costly or risky changes.
Tags : 
leverage, advantage, competitive, develop, innovation
    
Oracle