transaction data

Results 1 - 25 of 101
Published By: CA Technologies     Published Date: Aug 22, 2017
Effectively supporting these new business demands has become more complex and challenging. The increased use of mobile devices alone is driving exponential growth in transaction volumes. A customer pushes a button on his or her cell phone, for example, to check a bank balance. That single transaction triggers a cascade of transactions as the request is validated and data is accessed, retrieved and then sent back to the customer.
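The fan-out described above can be sketched in a few lines. This is a purely illustrative toy (the transaction names are invented, not from any real banking API): one user-facing request triggers several back-end transactions.

```python
# Hypothetical sketch: one button press on a phone fans out into a cascade
# of back-end transactions. All names here are illustrative assumptions.

def check_balance(account_id: str) -> dict:
    """Simulate one balance check and count the transactions it triggers."""
    transactions = []

    def record(name: str) -> None:
        transactions.append(name)

    record("validate_session")      # the request is validated
    record("authorize_account")     # the user's access to the account is checked
    record("fetch_balance")         # the data is accessed and retrieved
    record("format_response")       # the result is sent back to the customer

    return {"account": account_id, "backend_transactions": len(transactions)}

result = check_balance("acct-42")
print(result)  # one customer action, four back-end transactions
```

Multiply this by millions of mobile users and the exponential growth in transaction volume follows directly.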
Tags : 
storage, systems, network, applications, data, automation, ca technologies
    
CA Technologies
Published By: Workday     Published Date: Aug 11, 2017
This Workday webinar, featuring Paul Hamerman of Forrester Research, explores the power of uniting workforce and financial planning, budgeting, and forecasting. Learn best practices that help your organization succeed, such as embedding planning, budgeting, and forecasting into a single system with transaction data.
Tags : 
    
Workday
Published By: Oracle Hardware     Published Date: Oct 20, 2017
With the growing size and importance of information stored in today’s databases, accessing and using the right information at the right time has become increasingly critical. Real-time access and analysis of operational data is key to making faster and better business decisions, providing enterprises with unique competitive advantages. Running analytics on operational data has been difficult because operational data is stored in row format, which is best for online transaction processing (OLTP) databases, while storing data in column format is much better for analytics processing. Therefore, companies normally have both an operational database with data in row format and a separate data warehouse with data in column format, which leads to reliance on “stale data” for business decisions. With Oracle’s Database In-Memory and Oracle servers based on the SPARC S7 and SPARC M7 processors, companies can now store data in memory in both row and column formats, and run analytics on their operational data.
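The row-versus-column contrast the abstract describes can be shown with plain Python data structures. This is a minimal illustrative sketch, not how any database engine actually lays out memory: an aggregate over a column store touches one contiguous column instead of walking every whole row.

```python
# Row format: natural for OLTP, where you insert/update one record at a time.
rows = [
    {"id": 1, "region": "EU", "amount": 120.0},
    {"id": 2, "region": "US", "amount": 75.5},
    {"id": 3, "region": "EU", "amount": 60.0},
]

# Column format: natural for analytics, where you scan one attribute.
columns = {
    "id": [1, 2, 3],
    "region": ["EU", "US", "EU"],
    "amount": [120.0, 75.5, 60.0],
}

total_row_store = sum(r["amount"] for r in rows)  # reads every whole row
total_col_store = sum(columns["amount"])          # reads only one column

assert total_row_store == total_col_store == 255.5
```

Keeping both representations in memory, as the abstract describes, is what lets the same data serve transactions and analytics without an export step.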
Tags : 
    
Oracle Hardware
Published By: Oracle Hardware     Published Date: Oct 20, 2017
Databases have long served as the lifeline of the business. Therefore, it is no surprise that performance has always been top of mind. Whether it be a traditional row-formatted database to handle millions of transactions a day or a columnar database for advanced analytics to help uncover deep insights about the business, the goal is to service all requests as quickly as possible. This is especially true as organizations look to gain an edge on their competition by analyzing data from their transactional (OLTP) database to make more informed business decisions. The traditional model (see Figure 1) for doing this leverages two separate sets of resources, with an ETL being required to transfer the data from the OLTP database to a data warehouse for analysis. Two obvious problems exist with this implementation. First, I/O bottlenecks can quickly arise because the databases reside on disk; and second, analysis is constantly being done on stale data. In-memory databases have helped address part of these problems.
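The staleness problem in the traditional ETL model can be demonstrated end to end with Python's built-in `sqlite3` module. This is a hedged sketch of the pattern the abstract describes, not any vendor's implementation; the table and column names are invented for illustration.

```python
# Traditional model: a periodic ETL step copies OLTP rows into a separate
# warehouse table, so any analysis between runs sees stale data.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE oltp_orders (id INTEGER, amount REAL)")
db.execute("CREATE TABLE warehouse_orders (id INTEGER, amount REAL)")
db.executemany("INSERT INTO oltp_orders VALUES (?, ?)", [(1, 10.0), (2, 20.0)])

# ETL run: extract from the OLTP table, load into the warehouse copy.
db.execute("INSERT INTO warehouse_orders SELECT id, amount FROM oltp_orders")

# A new transaction arrives after the ETL run...
db.execute("INSERT INTO oltp_orders VALUES (3, 30.0)")

# ...so the warehouse's answer is already stale.
oltp_total = db.execute("SELECT SUM(amount) FROM oltp_orders").fetchone()[0]
dw_total = db.execute("SELECT SUM(amount) FROM warehouse_orders").fetchone()[0]
print(oltp_total, dw_total)  # 60.0 30.0 -- the warehouse lags the OLTP side
```

Running analytics directly against the transactional store, as the abstract advocates, removes both the transfer step and the lag.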
Tags : 
    
Oracle Hardware
Published By: StrongMail     Published Date: Jun 08, 2008
The growing trend towards insourcing marketing and transactional email is being driven by businesses that are looking for ways to improve their email programs, increase data security and lower costs. When evaluating whether it makes more sense to leverage an on-premise or outsourced solution, it's important to understand how the traditional arguments have changed.
Tags : 
strongmail, social media, email marketing, on-premise advantage, transactional email, insourcing, bounce management, spam trappers, emarketing, roi, networks, social networking
    
StrongMail
Published By: SAP     Published Date: May 18, 2014
This white paper discusses the issues involved in the traditional practice of deploying transactional and analytic applications on separate platforms using separate databases. It analyzes the results from a user survey, conducted on SAP's behalf by IDC, that explores these issues.
Tags : 
sap, big data, real time data, in memory technology, data warehousing, analytics, big data analytics, data management, business insights, architecture, business intelligence, big data tools
    
SAP
Published By: Hewlett Packard Enterprise     Published Date: Aug 02, 2017
What if you could reduce the cost of running Oracle databases and improve database performance at the same time? What would it mean to your enterprise and your IT operations? Oracle databases play a critical role in many enterprises. They’re the engines that drive critical online transaction (OLTP) and online analytical (OLAP) processing applications, the lifeblood of the business. These databases also create a unique challenge for IT leaders charged with improving productivity and driving new revenue opportunities while simultaneously reducing costs.
Tags : 
cost reduction, oracle database, it operation, online transaction, online analytics
    
Hewlett Packard Enterprise
Published By: SAP     Published Date: Nov 16, 2017
The SAP HANA platform has successfully become a proven mainstay data management solution, supporting the full range of analytic and transactional software from SAP, both in the data center and in the cloud.
Tags : 
    
SAP
Published By: Oracle CX     Published Date: Oct 20, 2017
With the growing size and importance of information stored in today’s databases, accessing and using the right information at the right time has become increasingly critical. Real-time access and analysis of operational data is key to making faster and better business decisions, providing enterprises with unique competitive advantages. Running analytics on operational data has been difficult because operational data is stored in row format, which is best for online transaction processing (OLTP) databases, while storing data in column format is much better for analytics processing. Therefore, companies normally have both an operational database with data in row format and a separate data warehouse with data in column format, which leads to reliance on “stale data” for business decisions. With Oracle’s Database In-Memory and Oracle servers based on the SPARC S7 and SPARC M7 processors, companies can now store data in memory in both row and column formats, and run analytics on their operational data.
Tags : 
    
Oracle CX
Published By: Oracle CX     Published Date: Oct 20, 2017
Databases have long served as the lifeline of the business. Therefore, it is no surprise that performance has always been top of mind. Whether it be a traditional row-formatted database to handle millions of transactions a day or a columnar database for advanced analytics to help uncover deep insights about the business, the goal is to service all requests as quickly as possible. This is especially true as organizations look to gain an edge on their competition by analyzing data from their transactional (OLTP) database to make more informed business decisions. The traditional model (see Figure 1) for doing this leverages two separate sets of resources, with an ETL being required to transfer the data from the OLTP database to a data warehouse for analysis. Two obvious problems exist with this implementation. First, I/O bottlenecks can quickly arise because the databases reside on disk; and second, analysis is constantly being done on stale data. In-memory databases have helped address part of these problems.
Tags : 
    
Oracle CX
Published By: Dell EMC     Published Date: Oct 08, 2015
As we’ve seen in this guide, to compete in this new multi-channel environment retailers have to adopt new and innovative strategies to attract and retain customers. Big data technologies, specifically Hadoop, enable retailers to connect with customers through multiple channels at an entirely new level by harnessing the vast volumes of new data available today. Hadoop helps retailers store, transform, integrate and analyze a wide variety of online and offline customer data—POS transactions, e-commerce transactions, clickstream data, email, social media, sensor data and call center records—all in one central repository.
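The value of the central repository is that per-channel silos can be joined into one customer view. The following is an illustrative sketch in plain Python standing in for a Hadoop job; the record fields and channel names are invented for the example.

```python
# Records from separate channels, each with its own silo.
pos = [{"customer": "c1", "amount": 40.0}]
ecommerce = [{"customer": "c1", "amount": 25.0}, {"customer": "c2", "amount": 10.0}]
call_center = [{"customer": "c2", "amount": 5.0}]

# Normalize everything into one shared schema in a central repository.
repository = []
for channel, records in [("pos", pos), ("ecom", ecommerce), ("call", call_center)]:
    for r in records:
        repository.append({"channel": channel, **r})

# A cross-channel view of each customer, impossible to build per silo.
spend = {}
for r in repository:
    spend[r["customer"]] = spend.get(r["customer"], 0.0) + r["amount"]
print(spend)  # {'c1': 65.0, 'c2': 15.0}
```

At retail scale the same transform-and-integrate step runs as a distributed job over the repository rather than as an in-memory loop.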
Tags : 
    
Dell EMC
Published By: ExtraHop     Published Date: Apr 03, 2013
The ExtraHop Discovery Edition is a free virtual appliance that will help you discover the performance of your applications across the network, web, VDI, database, and storage tiers. Get yours today!
Tags : 
it operational intelligence, application performance management, application performance monitoring, application monitoring, network performance management, network performance monitoring, network monitoring, infrastructure performance monitoring, infrastructure monitoring, business transaction management, end-user experience monitoring, end-user monitoring, real-user monitoring, operations analytics, cloud performance monitoring, web performance monitoring, database performance monitoring, citrix performance monitoring
    
ExtraHop
Published By: Entrust Datacard     Published Date: Apr 26, 2017
Research in the SSL/TLS security market points to a growing need for securing web applications with high-assurance certificates issued by a reputable Certification Authority (CA). According to industry analyst firm Frost & Sullivan in its in-depth analysis, SSL/TLS Certificates Market: Finding the Business Model in an All-Encrypt World, the integrity of the CA and the extended services offered through a certificate management platform (CtaaS) can produce a truly secure IT environment for website transactions. Organizations want to avoid the negative publicity associated with security breaches, and customers want to be assured of data protection when making online transactions. In this condensed report, catch the highlights of current industry trends and the ever-important need to secure your server with a reputable CA.
Tags : 
    
Entrust Datacard
Published By: CA Technologies     Published Date: Aug 26, 2016
Organizations handling transactions involving credit or debit cards are facing increasing pressure to meet regulatory compliance mandates. In particular, they must comply with the Payment Card Industry Data Security Standard (PCI DSS) version 3, which went into effect in January of 2015.
Tags : 
    
CA Technologies
Published By: Dynatrace     Published Date: Apr 26, 2017
It's impossible to optimize every page and action of every transaction for every device and user location... you need to identify the pages and actions that matter most and build an optimization plan. This report details how T-Mobile did exactly that, and how you can do the same:
- Base your plan on your own business and visitor data
- Correlate performance to transaction completion rate
- Determine where you'll see the most return for your technology and time investment
Download the report to read more.
Tags : 
digital experience, digital experience monitoring
    
Dynatrace
Published By: IBM     Published Date: Oct 13, 2016
Smart online transaction processing systems will be able to leverage transactions and big data analytics on demand, on an event-driven basis and in real time, for competitive advantage. Download to learn how!
Tags : 
big data, operational decision making, transaction data, it management, data management, data center, analytics, data science, data storage
    
IBM
Published By: IBM     Published Date: Oct 13, 2016
Compare IBM DB2 pureScale with any other offering being considered for implementing a clustered, scalable database configuration; see how they deliver continuous availability and why that matters. Download now!
Tags : 
data. queries, database operations, transactional databases, clustering, it management, storage, business technology, data storage
    
IBM
Published By: Dynatrace     Published Date: Jul 29, 2016
Gap free data helps you create and manage high-performing applications that deliver flawless end-user experience and customer loyalty. To be gap free, you must capture data from every single method in your application infrastructure, end-to-end, including timing and code-level context for all transactions, services and tiers, and make the data available for analysis. This eBook gives you technical and business case details that will show you why gap free data is a critical part of your application management strategy.
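The core idea of "gap free" capture — record timing and code-level context for every method, not a sample — can be illustrated with a small wrapper. This is a toy sketch of the concept, not Dynatrace's implementation; the function names are invented.

```python
# Wrap every method of interest so each call records its timing and
# code-level context, end to end, including nested calls.
import functools
import time

CAPTURED = []  # one record per call


def instrument(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        try:
            return func(*args, **kwargs)
        finally:
            CAPTURED.append({
                "method": func.__qualname__,              # code-level context
                "elapsed_s": time.perf_counter() - start,  # timing
            })
    return wrapper


@instrument
def lookup_account(account_id):
    return {"id": account_id}


@instrument
def handle_transaction(account_id):
    return lookup_account(account_id)  # nested call is captured too


handle_transaction("acct-1")
print([r["method"] for r in CAPTURED])  # inner call finishes (and records) first
```

A production agent does this by bytecode injection rather than decorators, but the captured record per call — method identity plus timing — is the same shape of data the eBook argues you need for every transaction, service and tier.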
Tags : 
dynatrace, gap free data, applications, application performance, development, devops, application infrastructure, application performance management, user experience, application management, software development, enterprise applications, business technology
    
Dynatrace
Published By: CA Technologies     Published Date: Jul 20, 2017
Mainframes continue to provide high business value by combining efficient transaction processing with high-volume access to critical enterprise data. Business organizations are linking mobile devices to mainframe processing and data to support digital applications and drive business transformation. In this rapidly growing scenario, the importance of providing excellent end-user experience becomes critical for business success. This analyst announcement note covers how CA Technologies is addressing the need for providing high availability and a fast response time by optimizing mainframe performance with new machine learning and analytics capabilities.
Tags : 
    
CA Technologies
Published By: IBM     Published Date: May 17, 2016
Analyst Mike Ferguson of Intelligent Business Strategies writes about the enhanced role of transactional DBMS systems in today's world of Big Data. Learn more about how Big Data provides richer transactional data and how that data is captured and analyzed to meet tomorrow’s business needs. Access the report now.
Tags : 
ibm, intelligent business solutions, big data, transaction data, business analytics
    
IBM
Published By: IBM     Published Date: Jul 05, 2016
This white paper, written for SAP customers evaluating their infrastructure choices, discusses database technology evolution and the options available.
Tags : 
ibm, always on business, analytics, database, database platform, sap, blu, db2, networking, knowledge management, enterprise applications, storage, data management, business technology, data storage, data security
    
IBM
Published By: NetApp     Published Date: Feb 19, 2015
NetApp Flash Pool is a storage cache option within the NetApp Virtual Storage Tier product family, available for NetApp FAS storage systems. A Flash Pool configures solid state drives (SSDs) and hard disk drives (HDDs) into a single storage pool, known as an “aggregate” in NetApp parlance, with the SSDs providing a fast response time cache for volumes that are provisioned on the Flash Pool aggregate. In this lab evaluation, NetApp commissioned Demartek to evaluate the effectiveness of Flash Pool with different types and numbers of hard disk drives using an online transaction processing (OLTP) database workload, and to evaluate the performance of Flash Pool in a clustered Data ONTAP environment during a cluster storage node failover scenario. In the report, you’ll discover how Demartek test engineers documented a 283% gain in IOPS and a reduction in latency by a factor of 66x after incorporating NetApp Flash Pool technology.
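Where the IOPS and latency gains come from can be shown with a toy model of a flash cache in front of disk. This is an illustrative LRU sketch, not NetApp's caching algorithm, and the latency figures are made-up placeholders: reads that hit the SSD tier simply avoid the slow tier.

```python
# Toy model of an SSD cache fronting HDDs: hits are served at flash
# latency, misses fall through to disk and populate the cache (LRU).
from collections import OrderedDict

SSD_LATENCY_MS = 0.1   # placeholder figure
HDD_LATENCY_MS = 8.0   # placeholder figure


class FlashPoolCache:
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.cache = OrderedDict()  # LRU order: coldest block first

    def read(self, block: int) -> float:
        """Return the simulated latency for reading one block."""
        if block in self.cache:
            self.cache.move_to_end(block)   # refresh LRU position
            return SSD_LATENCY_MS           # hit: served from flash
        if len(self.cache) >= self.capacity:
            self.cache.popitem(last=False)  # evict the coldest block
        self.cache[block] = True
        return HDD_LATENCY_MS               # miss: served from disk


cache = FlashPoolCache(capacity=2)
latencies = [cache.read(b) for b in (1, 2, 1, 1, 3)]
print(latencies)  # [8.0, 8.0, 0.1, 0.1, 8.0]
```

The hotter the OLTP working set, the more reads land on the flash tier, which is the effect the Demartek measurements quantify.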
Tags : 
    
NetApp
Published By: IBM     Published Date: Jul 06, 2017
DB2 is a proven database for handling the most demanding transactional workloads. But the trend of late is to enable relational databases to handle analytic queries more efficiently by adding an in-memory column store alongside to aggregate data and provide faster results. IBM's BLU Acceleration technology does exactly that. While BLU isn't brand new, the ability to spread the column store across a massively parallel processing (MPP) cluster of up to 1,000 nodes is a new addition to the technology. That, along with simpler monthly pricing options and integration with dashDB data warehousing in the cloud, makes DB2 for LUW a very versatile database.
Tags : 
memory analytics, database, efficiency, acceleration technology, aggregate data
    
IBM
Published By: NetApp     Published Date: Sep 22, 2014
NetApp Flash Pool is a storage cache option within the NetApp Virtual Storage Tier product family, available for NetApp FAS storage systems. A Flash Pool configures solid state drives (SSDs) and hard disk drives (HDDs) into a single storage pool, known as an “aggregate” in NetApp parlance, with the SSDs providing a fast response time cache for volumes that are provisioned on the Flash Pool aggregate. In this lab evaluation, NetApp commissioned Demartek to evaluate the effectiveness of Flash Pool with different types and numbers of hard disk drives using an online transaction processing (OLTP) database workload, and to evaluate the performance of Flash Pool in a clustered Data ONTAP environment during a cluster storage node failover scenario. In the report, you’ll discover how Demartek test engineers documented a 283% gain in IOPS and a reduction in latency by a factor of 66x after incorporating NetApp Flash Pool technology.
Tags : 
flash pool, fas storage systems, ssd, online transaction processing, cluster storage
    
NetApp
Published By: IBM     Published Date: May 23, 2017
IBM DB2 with BLU Acceleration helps tackle the challenges presented by big data. It delivers analytics at the speed of thought, always-available transactions, future-proof versatility, disaster recovery and streamlined ease-of-use to unlock the value of data.
Tags : 
cloud strategy, database projects, disaster recovery, geographic reach, large database, ibm, analytics, management optimization
    
IBM
