performance

Results 1 - 25 of 3626
Published By: IO     Published Date: Dec 31, 2015
The primary focus of any data center is the critical applications that form an enterprise’s business and operational core—its commercial and technical heart. Last-generation facilities, which were often designed using traditional methods created during the mainframe era, simply aren’t capable of handling the requirements of today’s business-critical applications. Additionally, deploying a data center with inadequate or inappropriate resources, or placing the facility in the wrong location, can negatively impact application performance and, in turn, enterprise success.
Tags : 
    
IO
Published By: Digital Realty     Published Date: Dec 02, 2015
A comprehensive approach to security requires much more than simply installing locks and hiring security officers. While these remain important aspects of an effective security plan, they are part of a broader, more integrative approach to security in today’s dynamic environment. For data center operators, ensuring the security and continuity of their clients’ business operations is a key and compelling imperative. This paper has examined the elements and organization of a holistic approach to security. Digital Realty views security as an integrated process, consisting of the subprocesses of physical security, information security, incident management, business continuity and compliance, enabled by the systems, processes and people providing quality of delivery and reliability of performance. Absent any of these elements, security becomes a series of loosely related tasks lacking in cohesive effectiveness.
Tags : 
    
Digital Realty
Published By: CommScope     Published Date: Apr 15, 2016
The data center has assumed a new, more prominent role as a strategic asset within the organization. Increasing capacity demands and the pressure to support the “always-on” digital business are forcing data centers to adapt, evolve, and respond at an increasingly accelerated rate. Cloud, mobility, IoT, big data – these and other interrelated trends are putting enormous pressure on the modern data center. To keep pace, today’s physical infrastructure has become vastly more complex, interconnected, and performance-driven than a decade ago.
Tags : 
    
CommScope
Published By: CyrusOne     Published Date: Jul 06, 2016
Many companies, especially those in the oil and gas industry, need high-density deployments of high-performance computing (HPC) environments to manage and analyze the extreme levels of computing involved with seismic processing. CyrusOne’s Houston West campus has the largest known concentration of HPC and high-density data center space in the colocation market today.
Tags : 
    
CyrusOne
Published By: Legrand     Published Date: Aug 09, 2016
Efficiency is a key objective when designing a data center. Efficiency gains typically focus solely on power and cooling, yet efficiencies can be realized in many other areas, yielding additional cost savings, reliable network performance, easier maintenance, flexibility, and scalability. The success and efficiency of the data center can be maximized by considering five key elements during design: performance, time, space, experience, and sustainability.
Tags : 
    
Legrand
Published By: Emerson Network Power     Published Date: Mar 18, 2016
Ponemon Institute and Emerson Network Power are pleased to present the results of the first Emerson Data Center IQ Quiz, part of the Data Center Performance Benchmark Series, which provides an industry-wide perspective on Availability, Security, Productivity, Cost and Speed of Deployment. The purpose of this study is to determine the domain knowledge of data center personnel while also collecting data on application of best practices and current operating conditions within participants’ data centers.
Tags : 
    
Emerson Network Power
Published By: Hewlett Packard Enterprise     Published Date: Feb 05, 2018
As businesses plunge into the digital future, no asset will have a greater impact on success than data. The ability to collect, harness, analyze, protect, and manage data will determine which businesses disrupt their industries, and which are disrupted; which businesses thrive, and which disappear. But traditional storage solutions are not designed to optimally handle such a critical business asset. Instead, businesses need to adopt an all-flash data center. In their new role as strategic business enablers, IT leaders have the responsibility to ensure that their businesses are protected, by investing in flexible, future-proof flash storage solutions. The right flash solution can deliver on critical business needs for agility, rapid growth, speed-to-market, data protection, application performance, and cost-effectiveness—while minimizing the maintenance and administration burden.
Tags : 
data, storage, decision makers, hpe
    
Hewlett Packard Enterprise
Published By: Hewlett Packard Enterprise     Published Date: Feb 05, 2018
Applications are the engines that drive today’s digital businesses. When the infrastructure that powers those applications is difficult to administer, or fails, businesses and their IT organizations are severely impacted. Traditionally, IT assumed much of the responsibility to ensure availability and performance. In the digital era, however, the industry needs to evolve and reset the requirements on vendors.
Tags : 
financial, optimization, hpe, predictive, analytics
    
Hewlett Packard Enterprise
Published By: Aberdeen     Published Date: Jun 17, 2011
Download this paper to learn the top strategies leading executives are using to take full advantage of the insight they receive from their business intelligence (BI) systems - and turn that insight into a competitive weapon.
Tags : 
aberdeen, michael lock, data-driven decisions, business intelligence, public sector, analytics, federal, state
    
Aberdeen
Published By: Rohde & Schwarz Cybersecurity     Published Date: Nov 28, 2017
DPI software must inspect packets at high wire speeds, so throughput and resource consumption are critical factors. Keeping the resources that integrated DPI and application classification technology requires low is essential: the fewer cores (on a multi-core processor) and the less on-board memory an engine needs, the better. Multi-threading provides almost linear scalability on multi-core systems, and highly optimized flow tracking is required to handle millions of concurrent subscribers (see the sketch after this entry).
Tags : 
detection, rate, performance, efficiency, accuracy, encrypted apps, integration, metadata
    
Rohde & Schwarz Cybersecurity
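To make the multi-threading and flow-tracking point concrete, here is a minimal, vendor-neutral sketch of hash-based flow dispatch in Python. The packet fields, worker count, and flow-table contents are assumptions for illustration only; a production DPI engine would be written in C/C++ with per-core threads and lock-free structures, not Python.

    import queue
    import threading

    NUM_WORKERS = 4  # illustrative: one worker per available core
    work_queues = [queue.Queue() for _ in range(NUM_WORKERS)]

    def flow_key(pkt):
        # The 5-tuple identifies a flow; every packet of a flow maps to the same worker.
        return (pkt["src_ip"], pkt["dst_ip"], pkt["src_port"], pkt["dst_port"], pkt["proto"])

    def dispatch(pkt):
        # Hash the flow key to pick a worker queue.
        work_queues[hash(flow_key(pkt)) % NUM_WORKERS].put(pkt)

    def worker(q):
        flow_table = {}  # per-worker flow state: byte counts, classified application, etc.
        while True:
            pkt = q.get()
            if pkt is None:
                break
            state = flow_table.setdefault(flow_key(pkt), {"bytes": 0, "app": None})
            state["bytes"] += len(pkt.get("payload", b""))
            # Application classification would inspect the payload here.

    threads = [threading.Thread(target=worker, args=(q,), daemon=True) for q in work_queues]
    for t in threads:
        t.start()

Because every packet of a flow lands on the same worker, per-flow state can be kept without cross-thread locking, which is what allows near-linear scaling as cores are added.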
Published By: Rohde & Schwarz Cybersecurity     Published Date: Nov 28, 2017
According to many market research analysts, the global wireless access point (WAP) market is anticipated to continue its upward trajectory and to grow at an impressive compound annual growth rate (CAGR) of approximately 8% through 2020. Many enterprises are utilizing cloud computing technology for cost-cutting purposes, eliminating the investments required for storage hardware and other physical infrastructure. With significant growth expected in Internet usage, particularly bandwidth-consuming video traffic, WAP vendors need to enable their customers to monitor and improve device performance, improve end-user experience, and enhance security. These customers include general enterprises that offer Internet access to patrons, such as airports, hotels, and retail/shopping centers. These external Internet access providers can differentiate themselves by offering optimum service through advanced network analytics, traffic shaping, application control, security capabilities and more.
Tags : 
utilization, challenges, dpi, benefits, airport, public, wifi, qoe
    
Rohde & Schwarz Cybersecurity
Published By: Oracle ZDLRA     Published Date: Jan 10, 2018
Traditional backup systems fail to meet the database protection and recovery requirements of modern organizations. These systems require ever-growing backup windows, negatively impact performance in mission-critical production databases, and deliver recovery time objectives (RTO) and recovery point objectives (RPO) measured in hours or even days. They therefore fail to meet the requirements of high-volume, highly transactional databases—potentially costing millions in lost productivity and revenue, regulatory penalties, and reputation damage due to an outage or data loss.
Tags : 
data protection, backup speed, recovery, overhead, assurance, storage, efficiency, oracle
    
Oracle ZDLRA
Published By: Pure Storage     Published Date: Jan 12, 2018
Data is growing at an astonishing rate, and that growth shows no sign of slowing. New techniques in data processing and analytics, including AI, machine learning, and deep learning, allow specially designed applications not only to analyze data but to learn from the analysis and make predictions. Processing that data requires computer systems built around multi-core CPUs or GPUs using parallel processing and extremely fast networks. However, legacy storage solutions are based on architectures that are decades old, unscalable, and poorly suited to the massive concurrency required by machine learning. Legacy storage is becoming a bottleneck in processing big data, and a new storage technology is needed to meet data analytics performance needs.
Tags : 
reporting, artificial intelligence, insights, organization, institution, recognition
    
Pure Storage
Published By: Pure Storage     Published Date: Jan 12, 2018
Apache Spark has become a critical tool for all types of businesses across all industries. It is enabling organizations to leverage the power of analytics to drive innovation and create new business models. The availability of public cloud services, particularly Amazon Web Services, has been an important factor in fueling the growth of Spark. However, IT organizations and Spark users are beginning to run up against limitations in relying on the public cloud—namely control, cost and performance.
Tags : 
data, storage, scalability, cost efficiencies, pure storage
    
Pure Storage
Published By: CA Technologies     Published Date: Jul 20, 2017
Mainframes continue to provide high business value by combining efficient transaction processing with high-volume access to critical enterprise data. Business organizations are linking mobile devices to mainframe processing and data to support digital applications and drive business transformation. In this rapidly growing scenario, the importance of providing an excellent end-user experience becomes critical for business success. This analyst announcement note covers how CA Technologies is addressing the need to provide high availability and fast response times by optimizing mainframe performance with new machine learning and analytics capabilities.
Tags : 
    
CA Technologies
Published By: CA Technologies     Published Date: Jul 20, 2017
This e-book lays out the case for machine learning and artificial intelligence in mainframe operational analytics. The mainframe is now part of a highly complex, connected ecosystem driving trillions of mobile and web transactions critical to the functioning of the application economy. The emergence of new workloads and apps on the mainframe means that the status quo isn’t enough when it comes to mainframe management. IT professionals alone – whether mainframe-skilled or not – simply can’t keep up with the onslaught of performance alerts and false alarms. Machine learning brings intelligence to mainframe operations, enabling a more proactive and automated approach to this challenge (a minimal illustrative sketch follows this entry).
Tags : 
    
CA Technologies
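As one way to picture how machine-driven analytics can cut through alert noise, here is a minimal baseline-versus-current anomaly score in Python. This is a generic z-score illustration using assumed sample values, not the algorithm CA’s products actually use.

    from statistics import mean, stdev

    def anomaly_score(history, current):
        # How many standard deviations the current sample sits from its own baseline.
        if len(history) < 2:
            return 0.0
        mu, sigma = mean(history), stdev(history)
        return 0.0 if sigma == 0 else abs(current - mu) / sigma

    # Hypothetical CPU-busy samples (percent) and a new reading.
    cpu_history = [62.0, 65.5, 61.2, 63.8, 64.1, 62.9]
    if anomaly_score(cpu_history, 91.4) > 3.0:
        print("CPU utilization anomaly: investigate before it becomes an outage")

Scoring each metric against its own history is what lets a system flag genuine deviations instead of firing on every static threshold breach.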
Published By: Akamai Technologies Australia     Published Date: Feb 08, 2018
Websites provide online businesses with an unprecedented level of contact with customers and end users. However, they also place business information where it can be easily accessed by third parties – often using automated tools known as “bots”. For many organizations, bots represent up to 50% or more of overall website traffic, from good bots engaged in essential business tasks to bad bots conducting fraudulent activities. Regardless of business impact, bot traffic can reduce website performance for legitimate users and increase IT costs. Organizations need a flexible framework to better manage their interaction with different categories of bots and the impact that bots have on their business and IT infrastructure (see the sketch after this entry).
Tags : 
control, visibility, customer, financial risk, web fraud, bots, infrastructure
    
Akamai Technologies Australia
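A toy example of what “different categories of bots” can mean in practice is sketched below in Python. The user-agent markers, rate threshold, and response actions are assumptions chosen for illustration; real bot management (Akamai’s included) relies on far richer behavioral and reputational signals.

    KNOWN_GOOD_BOTS = {"googlebot", "bingbot"}        # bots doing essential tasks, e.g. search indexing
    SUSPICIOUS_MARKERS = {"curl", "python-requests"}  # common scripted clients

    def categorize(user_agent: str, requests_per_minute: int) -> str:
        ua = user_agent.lower()
        if any(bot in ua for bot in KNOWN_GOOD_BOTS):
            return "good bot: allow, serve from cache to protect origin performance"
        if any(marker in ua for marker in SUSPICIOUS_MARKERS) or requests_per_minute > 300:
            return "suspected bad bot: challenge, rate-limit, or block"
        return "likely human: serve normally"

    print(categorize("Mozilla/5.0 (compatible; Googlebot/2.1)", 40))
    print(categorize("python-requests/2.31", 900))

The point of categorizing rather than simply blocking is that good bots can be allowed (and served cheaply from cache) while suspected bad bots are challenged or rate-limited.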
Published By: Oracle Dyn     Published Date: Dec 06, 2017
DNS speed and reliability are fundamental to the performance of your website and essential to your business. Contact Dyn today to learn how a supplemental DNS service can help you optimize DNS performance and improve user experiences. We can help you determine which multi-DNS option is best for your business and assist with planning and service integration efforts.
Tags : 
dns, service, management, traffic, vendors, delegation, protocol, traffic
    
Oracle Dyn
Published By: Oracle Dyn     Published Date: Dec 06, 2017
Creating a highly scalable, reliable, and efficient DNS infrastructure takes time, money, and expertise. You can accelerate your success and contain costs with a cloud-based service. Cloud-based DNS also opens up myriad opportunities to leverage the DNS infrastructure for global load balancing and traffic steering across hybrid environments (a minimal sketch follows this entry).
Tags : 
dns, performance, reliability, network, global, anycast, customer, satisfaction
    
Oracle Dyn
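To illustrate the traffic-steering idea, here is a minimal, health-aware, weighted endpoint selection in Python. The endpoint names, addresses, weights, and health flags are hypothetical, and this is not Oracle Dyn’s steering engine or API; it only shows the kind of per-query decision a DNS-based global load balancer makes.

    import random

    ENDPOINTS = [
        {"name": "on-prem-dc.example.com", "ip": "203.0.113.10", "weight": 70, "healthy": True},
        {"name": "cloud-east.example.com", "ip": "198.51.100.20", "weight": 20, "healthy": True},
        {"name": "cloud-west.example.com", "ip": "198.51.100.30", "weight": 10, "healthy": False},
    ]

    def resolve(hostname: str) -> str:
        # Answer a query by picking a healthy endpoint in proportion to its weight.
        healthy = [e for e in ENDPOINTS if e["healthy"]]
        if not healthy:
            raise RuntimeError(f"no healthy endpoints for {hostname}")
        pick = random.uniform(0, sum(e["weight"] for e in healthy))
        for e in healthy:
            pick -= e["weight"]
            if pick <= 0:
                return e["ip"]
        return healthy[-1]["ip"]

    print(resolve("www.example.com"))  # steered away from the unhealthy endpoint

Shifting weights or marking an endpoint unhealthy changes where new resolutions land, which is how DNS-level steering moves traffic between on-premises and cloud environments without touching the applications themselves.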
Published By: TE Connectivity     Published Date: Feb 09, 2018
TE Connectivity (TE) high-performance relays, contactors and switches are designed specifically to operate in extremely rigorous environments in military and aerospace applications. Our relay products include COTS (commercial off-the-shelf) and Mil-Spec parts, as well as highly specialized and custom-designed products. These high-performance products are designed to withstand extreme shock, vibration, temperature and altitude.
Tags : 
    
TE Connectivity
Published By: Group M_IBM Q1'18     Published Date: Dec 19, 2017
As organizations develop next-generation applications for the digital era, many are using cognitive computing ushered in by IBM Watson® technology. Cognitive applications can learn and react to customer preferences, and then use that information to support capabilities such as confidence-weighted outcomes with data transparency, systematic learning and natural language processing. To make the most of these next-generation applications, you need a next-generation database. It must handle a massive volume of data while delivering high performance to support real-time analytics. At the same time, it must provide data availability for demanding applications, scalability for growth and flexibility for responding to changes.
Tags : 
database, applications, data availability, cognitive applications
    
Group M_IBM Q1'18
Published By: Group M_IBM Q1'18     Published Date: Dec 19, 2017
For increasing numbers of organizations, the new reality for development, deployment and delivery of applications and services is hybrid cloud. Few, if any, organizations are going to move all their strategic workloads to the cloud, but virtually every enterprise is embracing cloud for a wide variety of requirements. To accelerate innovation, improve the IT delivery economic model and reduce risk, organizations need to combine data and experience in a cognitive model that yields deeper and more meaningful insights for smarter decision-making. Whether the user needs a data set maintained in-house for customer analytics or access to a cloud-based data store for assessing marketing program results — or any other business need — a high-performance, highly available, mixed-load database platform is required.
Tags : 
cloud, database, hybrid cloud, database platform
    
Group M_IBM Q1'18
Published By: IBM     Published Date: Oct 17, 2017
Every day, torrents of data inundate IT organizations and overwhelm the business managers who must sift through it all to glean insights that help them grow revenues and optimize profits. Yet, after investing hundreds of millions of dollars into new enterprise resource planning (ERP), customer relationship management (CRM), master data management (MDM), business intelligence (BI) data warehousing systems or big data environments, many companies are still plagued with disconnected, “dysfunctional” data—a massive, expensive sprawl of disparate silos and unconnected, redundant systems that fail to deliver the desired single view of the business. To meet the business imperative for enterprise integration and stay competitive, companies must manage the increasing variety, volume and velocity of new data pouring into their systems from an ever-expanding number of sources. They need to bring all their corporate data together and deliver it to end users as quickly as possible to maximize its value.
Tags : 
    
IBM
Published By: IBM     Published Date: Nov 08, 2017
Flexible deployment options and licensing models help take the challenges out of change. As you move toward the cloud, you're likely planning or managing a mixed environment of on-premises and on-cloud applications. To help you succeed in this transition, you need a transformative, mixed-workload database that can handle a massive volume of data while delivering high performance, data availability and the flexibility to adapt and respond to business changes.
Tags : 
ibm, cloud, cloud computing, database, ibm db2
    
IBM
Published By: IBM     Published Date: Nov 08, 2017
Flexible deployment options and licensing models help take the challenges out of change. As you move toward the cloud, you're likely planning or managing a mixed environment of on-premises and on-cloud applications. To help you succeed in this transition, you need a transformative, mixed-workload database that can handle a massive volume of data while delivering high performance, data availability and the flexibility to adapt and respond to business changes.
Tags : 
ibm db2, cloud, on-cloud applications, mixed-workload database
    
IBM