data analysis

Published By: CyrusOne     Published Date: Jul 02, 2016
Even through challenging economic times, the need for physical data center capacity continues to grow. For some businesses, the driver is expansion into new markets or geographies. For others, it's the need to deal with growing amounts of data generated by applications with high-capacity demands, evolving end-user abilities, or regulatory bodies that demand ever-increasing quantities of meticulous documentation. The "build-or-buy" decision between construction and colocation should be weighed carefully, as the choice will affect your company and your bottom line quite literally for decades. This executive report will review six key factors that affect that choice, some of which extend beyond a basic TCO analysis.
Tags : 
    
CyrusOne
Published By: QTS Realty Trust, Inc.     Published Date: Jul 19, 2016
Chicago Data Center Market: Market Overview & Analysis. The Windy City is a major hub for Internet and financial infrastructure, with active communities of data center users and service providers. Chicago is America’s third-largest city and an active business market, with nearly 40 Fortune 500 companies headquartered in the metro area. Chicago is distinctive in that it sees demand for data center space from a wide range of industries. It is home to major trading exchanges for stocks, commodities, and options, making the city a hotbed of activity for the financial services industry. The region has also become a favored location for hosting, colocation, and cloud computing companies. Chicago sees strong demand from the enterprise sector as well, both for primary data centers and for backup/disaster recovery facilities.
Tags : 
    
QTS Realty Trust, Inc.
Published By: Pure Storage     Published Date: Jan 12, 2018
Data is growing at an astonishing rate, and that growth will only continue. New techniques in data processing and analytics, including AI, machine learning, and deep learning, allow specially designed applications not only to analyze data but to learn from the analysis and make predictions. Processing that data requires computer systems built around multi-core CPUs or GPUs, parallel processing, and extremely fast networks. Legacy storage solutions, however, are based on architectures that are decades old, hard to scale, and poorly suited to the massive concurrency that machine learning requires. Legacy storage has become a bottleneck in processing big data, and a new storage technology is needed to meet data analytics performance needs.
Tags : 
reporting, artificial intelligence, insights, organization, institution, recognition
    
Pure Storage
Published By: Oracle     Published Date: Oct 20, 2017
In today’s IT infrastructure, data security can no longer be treated as an afterthought, because billions of dollars are lost each year to computer intrusions and data exposures. This issue is compounded by the aggressive build-out for cloud computing. Big data and machine learning applications that perform tasks such as fraud and intrusion detection, trend detection, and click-stream and social media analysis all require forward-thinking solutions and enough compute power to deliver the performance required in a rapidly evolving digital marketplace. Companies increasingly need to drive the speed of business up, and organizations need to support their customers with real-time data. The task of managing sensitive information while capturing, analyzing, and acting upon massive volumes of data every hour of every day has become critical. These challenges have dramatically changed the way that IT systems are architected, provisioned, and run compared to the past few decades. Most compani
Tags : 
    
Oracle
Published By: Oracle     Published Date: Oct 20, 2017
With the growing size and importance of information stored in today’s databases, accessing and using the right information at the right time has become increasingly critical. Real-time access and analysis of operational data is key to making faster and better business decisions, providing enterprises with unique competitive advantages. Running analytics on operational data has been difficult because operational data is stored in row format, which is best for online transaction processing (OLTP) databases, while column format is much better for analytics processing. Therefore, companies normally maintain both an operational database with data in row format and a separate data warehouse with data in column format, which leads to reliance on “stale data” for business decisions. With Oracle’s Database In-Memory and Oracle servers based on the SPARC S7 and SPARC M7 processors, companies can now store data in memory in both row and column formats and run analytics on their operational data (a short sketch of the row-versus-column difference follows this entry).
Tags : 
    
Oracle
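The Oracle entry above turns on the difference between row-oriented storage (suited to OLTP lookups and updates) and column-oriented storage (suited to analytic scans). The Python sketch below is a generic illustration of that difference, not Oracle Database In-Memory code; the table, column names, and values are invented.

```python
# Hypothetical illustration of row vs. column layouts; not Oracle-specific code.
from collections import defaultdict

# The same "orders" table stored two ways.
rows = [  # row store: each record kept together -- ideal for OLTP lookups and updates
    {"id": 1, "region": "EMEA", "amount": 120.0},
    {"id": 2, "region": "APAC", "amount": 75.5},
    {"id": 3, "region": "EMEA", "amount": 60.0},
]

columns = {  # column store: each attribute kept together -- ideal for analytic scans
    "id": [1, 2, 3],
    "region": ["EMEA", "APAC", "EMEA"],
    "amount": [120.0, 75.5, 60.0],
}

# OLTP-style access: fetch one whole record by key (the row layout keeps it contiguous).
order = next(r for r in rows if r["id"] == 2)

# Analytic query: total amount per region (the column layout touches only two columns).
totals = defaultdict(float)
for region, amount in zip(columns["region"], columns["amount"]):
    totals[region] += amount

print(order)          # {'id': 2, 'region': 'APAC', 'amount': 75.5}
print(dict(totals))   # {'EMEA': 180.0, 'APAC': 75.5}
```

With both layouts held in memory, as the entry describes, the same data can serve transactional lookups and analytic aggregations without being copied to a separate warehouse first.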
Published By: Oracle     Published Date: Oct 20, 2017
Databases have long served as the lifeline of the business, so it is no surprise that performance has always been top of mind. Whether it is a traditional row-formatted database handling millions of transactions a day or a columnar database for advanced analytics that uncovers deep insights about the business, the goal is to service all requests as quickly as possible. This is especially true as organizations look to gain an edge on their competition by analyzing data from their transactional (OLTP) database to make more informed business decisions. The traditional model (see Figure 1) for doing this uses two separate sets of resources, with an ETL process required to transfer the data from the OLTP database to a data warehouse for analysis. Two obvious problems exist with this implementation: first, I/O bottlenecks arise quickly because the databases reside on disk; second, analysis is constantly being done on stale data (a simplified sketch of this batch-ETL staleness problem follows this entry). In-memory databases have helped address p
Tags : 
    
Oracle
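The entry above describes the traditional pattern: a periodic ETL job copies data from the OLTP database into a separate warehouse, so analytics runs on disk-bound, stale copies. Below is a minimal sketch of that pattern, using SQLite purely as a stand-in; the schema, table name, and values are made up for illustration.

```python
# Illustrative-only sketch of a periodic batch ETL; names and schema are hypothetical.
import sqlite3

oltp = sqlite3.connect(":memory:")       # stands in for the transactional database
warehouse = sqlite3.connect(":memory:")  # stands in for the analytic data warehouse

oltp.execute("CREATE TABLE sales (id INTEGER, amount REAL)")
warehouse.execute("CREATE TABLE sales (id INTEGER, amount REAL)")

oltp.executemany("INSERT INTO sales VALUES (?, ?)", [(1, 100.0), (2, 250.0)])

def run_etl():
    """Copy everything from the OLTP database to the warehouse (the periodic batch job)."""
    batch = oltp.execute("SELECT id, amount FROM sales").fetchall()
    warehouse.execute("DELETE FROM sales")
    warehouse.executemany("INSERT INTO sales VALUES (?, ?)", batch)

run_etl()

# A new transaction lands *after* the ETL window...
oltp.execute("INSERT INTO sales VALUES (3, 999.0)")

# ...so the warehouse answer is already stale until the next batch runs.
print(warehouse.execute("SELECT SUM(amount) FROM sales").fetchone()[0])  # 350.0
print(oltp.execute("SELECT SUM(amount) FROM sales").fetchone()[0])       # 1349.0
```

The gap between the two totals is exactly the "stale data" problem the entry describes; keeping the analytic copy in memory alongside the transactional data is the alternative the paper argues for.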
Published By: Ounce Labs, an IBM Company     Published Date: Dec 15, 2009
Today, when you make decisions about information technology (IT) security priorities, you must often strike a careful balance between business risk, impact, and likelihood of incidents, and the costs of prevention or cleanup. Historically, the most well-understood variable in this equation was the methods that hackers used to disrupt or invade the system.
Tags : 
ounce labs, it security, it risk, software applications, pci dss, hipaa, glba, data security, source code vulnerabilities
    
Ounce Labs, an IBM Company
Published By: Ounce Labs, an IBM Company     Published Date: Jul 08, 2009
The Business Case for Data Protection, conducted by Ponemon Institute and sponsored by Ounce Labs, is the first study to determine what senior executives think about the value proposition of corporate data protection efforts within their organizations. In times of shrinking budgets, it is important for those individuals charged with managing a data protection program to understand how key decision makers in organizations perceive the importance of safeguarding sensitive and confidential information.
Tags : 
ounce labs, it security, it risk, software applications, ciso, pci dss, hipaa, glba, data security
    
Ounce Labs, an IBM Company
Published By: SAP     Published Date: Feb 03, 2017
The SAP HANA platform provides a powerful unified foundation for storing, processing, and analyzing structured and unstructured data. It runs on a single, in-memory database, eliminating data redundancy and speeding up the time for information research and analysis.
Tags : 
    
SAP
Published By: Oracle     Published Date: Nov 28, 2017
Today’s leading-edge organizations differentiate themselves through analytics, extracting value from all their data sources to further their competitive advantage. Other companies are looking to become data-driven by modernizing their data management deployments. These strategies come with challenges, such as managing large and growing volumes of data. Today’s digital world is already creating data at an explosive rate, and the next wave is on the horizon, driven by the emergence of IoT data sources. The physical data warehouses of the past were great for collecting data from across the enterprise for analysis, but the storage and compute resources needed to support them cannot keep pace with this explosive growth. In addition, the manual, cumbersome tasks of patching, updating, and upgrading pose risks to data through human error. To reduce risks, costs, complexity, and time to value, many organizations are taking their data warehouses to the cloud. Whether hosted lo
Tags : 
    
Oracle
Published By: HP Enterprise Business     Published Date: Mar 02, 2017
Powered by data from 451 Research, the Right Mix web application benchmarks your current private vs public cloud mix, business drivers, and workload deployment venues against industry peers to create a comparative analysis. See how your mix stacks up, then download the 451 Research report for robust insights into the state of the hybrid IT market.
Tags : 
    
HP Enterprise Business
Published By: Pentaho     Published Date: Nov 04, 2015
Although the phrase “next-generation platforms and analytics” can evoke images of machine learning, big data, Hadoop, and the Internet of things, most organizations are somewhere in between the technology vision and today’s reality of BI and dashboards. Next-generation platforms and analytics often mean simply pushing past reports and dashboards to more advanced forms of analytics, such as predictive analytics. Next-generation analytics might move your organization from visualization to big data visualization; from slicing and dicing data to predictive analytics; or to using more than just structured data for analysis.
Tags : 
pentaho, analytics, platforms, hadoop, big data, predictive analytics, networking, it management
    
Pentaho
Published By: Cisco     Published Date: Jun 21, 2016
The demands on IT today are staggering. Most organizations depend on their data to drive everything from product development and sales to communications, operations, and innovation. As a result, IT departments are charged with finding a way to bring new applications online quickly, accommodate massive data growth and complex data analysis, and make data available 24 hours a day, around the world, on any device. The traditional way to deliver data services is with separate infrastructure silos for various applications, processes, and locations, resulting in continually escalating costs for infrastructure and management. These infrastructure silos make it difficult to respond quickly to business opportunities and threats, cause productivity-hindering delays when you need to scale, and drive up operational costs.
Tags : 
    
Cisco
Published By: Adobe     Published Date: Nov 09, 2017
Patients are going digital, and they are taking the healthcare system with them. Learn how in the 2017 Digital Trends in Healthcare and Pharma report. Download it now to learn:
• Why two-thirds of healthcare companies are investing in data analysis.
• How they’re building content marketing programs to boost patient knowledge.
• What they plan to do with virtual and augmented reality this year and beyond.
Tags : 
    
Adobe
Published By: Oracle CX     Published Date: Oct 19, 2017
In today’s IT infrastructure, data security can no longer be treated as an afterthought, because billions of dollars are lost each year to computer intrusions and data exposures. This issue is compounded by the aggressive build-out for cloud computing. Big data and machine learning applications that perform tasks such as fraud and intrusion detection, trend detection, and click-stream and social media analysis all require forward-thinking solutions and enough compute power to deliver the performance required in a rapidly evolving digital marketplace. Companies increasingly need to drive the speed of business up, and organizations need to support their customers with real-time data. The task of managing sensitive information while capturing, analyzing, and acting upon massive volumes of data every hour of every day has become critical. These challenges have dramatically changed the way that IT systems are architected, provisioned, and run compared to the past few decades. Most companies
Tags : 
    
Oracle CX
Published By: Oracle CX     Published Date: Oct 20, 2017
With the growing size and importance of information stored in today’s databases, accessing and using the right information at the right time has become increasingly critical. Real-time access and analysis of operational data is key to making faster and better business decisions, providing enterprises with unique competitive advantages. Running analytics on operational data has been difficult because operational data is stored in row format, which is best for online transaction processing (OLTP) databases, while column format is much better for analytics processing. Therefore, companies normally maintain both an operational database with data in row format and a separate data warehouse with data in column format, which leads to reliance on “stale data” for business decisions. With Oracle’s Database In-Memory and Oracle servers based on the SPARC S7 and SPARC M7 processors, companies can now store data in memory in both row and column formats and run analytics on their operational data.
Tags : 
    
Oracle CX
Published By: Oracle CX     Published Date: Oct 20, 2017
Databases have long served as the lifeline of the business, so it is no surprise that performance has always been top of mind. Whether it is a traditional row-formatted database handling millions of transactions a day or a columnar database for advanced analytics that uncovers deep insights about the business, the goal is to service all requests as quickly as possible. This is especially true as organizations look to gain an edge on their competition by analyzing data from their transactional (OLTP) database to make more informed business decisions. The traditional model (see Figure 1) for doing this uses two separate sets of resources, with an ETL process required to transfer the data from the OLTP database to a data warehouse for analysis. Two obvious problems exist with this implementation: first, I/O bottlenecks arise quickly because the databases reside on disk; second, analysis is constantly being done on stale data. In-memory databases have helped address p
Tags : 
    
Oracle CX
Published By: Cloudian     Published Date: Feb 15, 2018
We are living in an age of explosive data growth. IDC projects that the digital universe is growing 50% a year, doubling in size every two years. In media and entertainment, the growth is even faster as capacity-intensive formats such as 4K, 8K, and 360/VR gain traction. Fortunately, new trends in data storage are making it easier to stay ahead of the curve. In this paper, we will examine how object storage stacks up against LTO tape for media archives and backup. In addition to a detailed total cost of ownership (TCO) analysis covering both capital and operational expenses (a simplified sketch of the shape of such a calculation follows this entry), this paper will look at the opportunity costs of not leveraging the real-time data access of object storage to monetize existing data. Finally, we will demonstrate the validity of the analysis with a real-world case study of a longstanding network TV show that made the switch from tape to object storage. The limitations of tape storage go well beyond its lack of scalability. Data that isn't searchable is becoming
Tags : 
    
Cloudian
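The Cloudian paper above centers on a total cost of ownership comparison between LTO tape and object storage, covering capital and operational expenses. The sketch below only shows the shape of such a calculation; every number is a placeholder rather than a figure from the report, and it ignores the opportunity costs and discounting a real analysis would include.

```python
# Hypothetical TCO comparison over a multi-year horizon; all numbers are placeholders,
# not figures from the Cloudian report, and imply no conclusion either way.
def tco(capex, annual_opex, years):
    """Simple undiscounted TCO: up-front capital cost plus operating cost per year."""
    return capex + annual_opex * years

YEARS = 5
tape_tco = tco(capex=200_000, annual_opex=40_000, years=YEARS)    # drives, media, handling
object_tco = tco(capex=150_000, annual_opex=55_000, years=YEARS)  # storage nodes, power, admin

print(f"LTO tape TCO over {YEARS} years:       ${tape_tco:,}")
print(f"Object storage TCO over {YEARS} years: ${object_tco:,}")
```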
Published By: IBM APAC     Published Date: Nov 22, 2017
Glossary: Terms You Should Know. Take your data analysis and deep learning expertise to the next level with these industry terms.
Tags : 
api, application, programming, interface, artificial intelligence, base, encoding, chrome
    
IBM APAC
Published By: Delphix     Published Date: May 03, 2016
Data security is a top concern these days. In a world of privacy regulation, intellectual property theft, and cybercrime, ensuring data security and protecting sensitive enterprise data is crucial. Only a data masking solution can secure vital data and enable outsourcing, third-party analysis, and cloud deployments. But more often than not, masking projects fail: even some of the best data masking tools bottleneck processes, and once masked, data is hard to move and manage across the application development lifecycle. (A generic sketch of deterministic masking follows this entry.)
Tags : 
    
Delphix
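The Delphix entry above is about masking sensitive data before it is shared for outsourcing, third-party analysis, or cloud deployments. The sketch below shows one generic masking technique, deterministic pseudonymization, and is not a description of how the Delphix product works; the field names and the salt are invented for illustration.

```python
# Generic data-masking sketch; field names and the salt are hypothetical, and this is
# not how any particular masking product works internally.
import hashlib

SALT = b"rotate-me-per-environment"  # hypothetical per-environment secret

def mask(value: str) -> str:
    """Deterministic pseudonym: the same input always maps to the same token."""
    return hashlib.sha256(SALT + value.encode()).hexdigest()[:12]

patients = [
    {"name": "Ada Lovelace", "ssn": "123-45-6789", "diagnosis": "A01"},
    {"name": "Alan Turing",  "ssn": "987-65-4321", "diagnosis": "B02"},
]

masked = [
    {"name": mask(p["name"]), "ssn": mask(p["ssn"]), "diagnosis": p["diagnosis"]}
    for p in patients  # analytic columns stay usable; direct identifiers do not
]

print(masked)
```

Because the mapping is deterministic, masked identifiers still join consistently across tables, which is what keeps masked copies usable for development and analysis while the real values never leave the source.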
Published By: Dell EMC     Published Date: Nov 09, 2015
Download this whitepaper and learn how the Dell Genomic Data Analysis Platform can accelerate discovery and insights through optimized infrastructure and support.
Tags : 
    
Dell EMC
Published By: Dell EMC     Published Date: Oct 08, 2015
Download this whitepaper for:
• An overview of how manufacturing can benefit from the big data technology stack
• A high-level view of common big data pain points for manufacturers
• A detailed analysis of big data technology for manufacturers
• A view as to how manufacturers are going about big data adoption
• A proven case study with Omneo
Tags : 
    
Dell EMC
Published By: Dell     Published Date: Jan 26, 2015
Forrester presents the relevant endpoint security data from their most recent surveys, with special attention given to those trends affecting SMBs (firms with 20 to 999 employees) and enterprises (firms with 1,000+ employees), along with analysis that explains the data in the context of the overall security landscape. As organizations prepare for the 2015 budget cycle, security and risk (S&R) professionals should use this annual report to help benchmark their organization’s spending patterns against those of their peers — while keeping an eye on current trends affecting endpoint security — in order to strategize their endpoint security adoption decisions. Please download this Forrester Research report, offered compliments of Dell, for more information.
Tags : 
dell, windows, endpoint security adoption, forrester, smbs, budget cycle, security
    
Dell
Published By: Collaborative Consulting     Published Date: Dec 20, 2013
The explosion of Big Data represents an opportunity to leverage trending attitudes in the marketplace to better segment and target customers, and enhance products and promotions. Success requires establishing a common business rationale for harnessing social media and determining a maturity model for sentiment analysis to assess existing social media capabilities.
Tags : 
collaborative consulting, big data, social media, customer sentiment, influence sentiment, govern sentiment, maturity model, generate revenue
    
Collaborative Consulting
Published By: SAS     Published Date: Jul 14, 2015
With sophisticated analytics, government leaders can pinpoint the underlying value in all their data. They can bring it together in a unified fashion and see connections across agencies to better serve citizens.
Tags : 
analytics, data analysis, management, knowledge pooling, data infrastructure
    
SAS