transaction data

Results 26 - 50 of 161
Published By: Pure Storage     Published Date: Jul 03, 2019
Splunk® has become a mission-critical application. Thousands of organizations are gaining insight from their machine data and transaction logs using Splunk, and many more are planning to deploy it. No matter what stage you’re in, having guidelines to follow can help improve the Splunk experience. Since a mission-critical data application deserves a mission-critical data platform, Pure Storage® built its solution on Pure FlashStack™ converged infrastructure, a joint offering from Cisco® and Pure Storage. This paper provides a framework for designing and sizing a high-performance, scalable, and resilient Splunk platform. Pure Storage is a leading all-flash array provider focused on reducing storage complexity while improving Splunk performance, resiliency, and efficiency. To ensure that your Splunk platform is sized appropriately, Pure Storage tested Splunk Enterprise on its FlashStack platform; the paper closes with the top takeaways from those tests.
Tags : 
    
Pure Storage
Published By: Dell PC Lifecycle     Published Date: Mar 09, 2018
In the end, the Dell EMC VMAX 250F All Flash storage array with Intel® Xeon® processors lived up to its promises better than the HPE 3PAR 8450 storage array did. We experienced minimal impact to database performance when the VMAX 250F processed transactional and data mart loading at the same time. This is useful whether you're performing extensive backups or compiling large amounts of data from multiple sources. Intel Inside®. New Possibilities Outside.
Tags : 
    
Dell PC Lifecycle
Published By: Dell PC Lifecycle     Published Date: Mar 09, 2018
Compression algorithms reduce the number of bits needed to represent a set of data—the higher the compression ratio, the more space this particular data reduction technique saves. During our OLTP test, the Unity array achieved a compression ratio of 3.2-to-1 on the database volumes, whereas the 3PAR array averaged a 1.3-to-1 ratio. In our data mart loading test, the 3PAR achieved a ratio of 1.4-to-1 on the database volumes, whereas the Unity array achieved 1.3-to-1.
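A quick way to read these ratios: a minimal Python sketch (our own illustration, not from the paper) converting an N-to-1 compression ratio into the fraction of raw capacity saved.

    def space_saved(ratio: float) -> float:
        """Fraction of raw capacity saved at a given N-to-1 compression ratio."""
        return 1.0 - 1.0 / ratio

    for name, ratio in [("Unity OLTP", 3.2), ("3PAR OLTP", 1.3),
                        ("3PAR data mart", 1.4), ("Unity data mart", 1.3)]:
        print(f"{name}: {ratio}-to-1 -> {space_saved(ratio):.0%} saved")
    # Unity OLTP: 3.2-to-1 -> 69% saved; 3PAR OLTP: 1.3-to-1 -> 23% saved

So the Unity array's 3.2-to-1 OLTP ratio means roughly 69% less raw capacity consumed, versus about 23% at 1.3-to-1.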
Tags : 
    
Dell PC Lifecycle
Published By: Dell PC Lifecycle     Published Date: Mar 09, 2018
When your company’s work demands a new storage array, you have the opportunity to invest in a solution that can support demanding workloads simultaneously—such as online transaction processing (OLTP) and data mart loading. At Principled Technologies, we compared Dell EMC™ PowerEdge™ R930 servers with Intel® Xeon® processors and the Dell EMC Unity 400F All Flash storage array to HPE ProLiant DL580 Gen9 servers with the HPE 3PAR 8400 array in three hands-on tests to determine how well each solution could serve a company during these database-intensive tasks. Intel Inside®. New Possibilities Outside.
Tags : 
    
Dell PC Lifecycle
Published By: Dell PC Lifecycle     Published Date: Mar 09, 2018
Prevent unexpected downtime with reliable failover protection. When we interrupted access to both local storage arrays, the Dell EMC database host seamlessly redirected all I/O to the remote VMAX 250F with Intel® Xeon® processors via SRDF/Metro, with no interruption of service or downtime. The 3PAR solution crashed and did not recover until the standby paths became active and we restarted the VM.
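The redirect behavior is easy to picture with a toy Python sketch (our own illustration of the general multipath-failover idea, not of SRDF/Metro internals): writes fall through to the next configured path when the local array stops responding.

    def failing_send(block):
        raise ConnectionError("local array interrupted")

    def ok_send(block):
        pass  # write accepted by the remote mirror

    # Local array first, remote replica as the fallback path.
    paths = [{"name": "local-VMAX", "send": failing_send},
             {"name": "remote-VMAX-250F", "send": ok_send}]

    def write_block(block: bytes) -> str:
        for path in paths:
            try:
                path["send"](block)
                return path["name"]       # I/O served by this path
            except ConnectionError:
                continue                  # seamless redirect to the next path
        raise RuntimeError("all storage paths down")

    print(write_block(b"data"))  # -> remote-VMAX-250F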
Tags : 
    
Dell PC Lifecycle
Published By: Dell PC Lifecycle     Published Date: Mar 09, 2018
In conclusion, the Dell EMC VMAX 250F All Flash storage array delivered on its promises better than the HPE 3PAR 8450 storage array did. The VMAX 250F processed transactional and data mart loading simultaneously with minimal impact on database performance. This is useful when running full backups or compiling large amounts of data from multiple sources.
Tags : 
    
Dell PC Lifecycle
Published By: Pure Storage     Published Date: Apr 10, 2019
Splunk® has become a mission-critical application. Thousands of organizations are gaining insight from their machine data and transaction logs using Splunk, and many more are planning to deploy it. No matter what stage you’re in, having guidelines to follow can help improve the Splunk experience. Since a mission-critical data application deserves a mission-critical data platform, Pure Storage® built its solution on Pure FlashStack™ converged infrastructure, a joint offering from Cisco® and Pure Storage. This paper provides a framework for designing and sizing a high-performance, scalable, and resilient Splunk platform. Pure Storage is a leading all-flash array provider focused on reducing storage complexity while improving Splunk performance, resiliency, and efficiency.
Tags : 
    
Pure Storage
Published By: Dynatrace     Published Date: Apr 26, 2017
It's impossible to optimize every page and action of every transaction for every device and user location, so you need to identify the pages and actions that matter most and build an optimization plan. This report details how T-Mobile did exactly that, and how you can do the same:
• Base your plan on your own business and visitor data
• Correlate performance to transaction completion rate (a sketch of this step follows below)
• Determine where you'll see the most return for your technology and time investment
Download the report to read more.
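On the correlation step, a minimal pandas sketch (a hypothetical illustration; the column names and figures are assumptions, not T-Mobile data):

    import pandas as pd

    sessions = pd.DataFrame({
        "response_time_s": [1.2, 2.8, 0.9, 4.1, 3.3, 1.7],   # page timing
        "completed_txn":   [1,   0,   1,   0,   1,   1],     # 1 = completed
    })

    # A strongly negative Pearson correlation suggests slower pages depress
    # completion rate, flagging those pages for optimization first.
    print(sessions["response_time_s"].corr(sessions["completed_txn"]))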
Tags : 
digital experience, digital experience monitoring
    
Dynatrace
Published By: Oracle ZDLRA     Published Date: Jan 10, 2018
Traditional backup systems fail to meet the database protection and recovery requirements of modern organizations. These systems require ever-growing backup windows, negatively impact performance in mission-critical production databases, and deliver recovery time objectives (RTO) and recovery point objectives (RPO) measured in hours or even days. They thus fail to meet the requirements of high-volume, highly transactional databases, potentially costing millions in lost productivity and revenue, regulatory penalties, and reputation damage due to an outage or data loss.
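To make the stakes concrete, a back-of-the-envelope Python sketch (our own illustration; the throughput and RPO figures are assumptions):

    def transactions_at_risk(tps: float, rpo_hours: float) -> float:
        # Transactions potentially lost if recovery can only reach
        # the last backup point.
        return tps * rpo_hours * 3600

    # A modest 500-TPS database with a 4-hour RPO:
    print(f"{transactions_at_risk(500, 4):,.0f} transactions at risk")  # 7,200,000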
Tags : 
data protection, backup speed, recovery, overhead, assurance, storage, efficiency, oracle
    
Oracle ZDLRA
Published By: IBM     Published Date: Oct 13, 2016
Smart online transaction processing systems will be able to leverage transactions and big data analytics on demand, on an event-driven basis, and in real time for competitive advantage. Download to learn how!
Tags : 
big data, operational decision making, transaction data, it management, data management, data center, analytics, data science, data storage
    
IBM
Published By: IBM     Published Date: Oct 13, 2016
Compare IBM DB2 pureScale with any other offering being considered for implementing a clustered, scalable database configuration; see how these solutions deliver continuous availability and why it matters. Download now!
Tags : 
data queries, database operations, transactional databases, clustering, it management, storage, business technology, data storage
    
IBM
Published By: Brother     Published Date: Mar 08, 2018
Documents are an integral component to the successful operation of an organization. Whether in hardcopy or digital form, they enable the communication, transaction, and recording of business-critical information. To ensure documents are used effectively, organizations are encouraged to continually evaluate and improve surrounding workflows. This may involve automating elements of document creation, securing the transfer and storage of information, and/or simplifying the retrieval of records and the data contained within. These types of enhancements can save time, money, and frustration. This white paper will discuss top trends and requirements in the optimization of document-related business processes as well as general technology infrastructures for document management. It will also address how some office technology vendors have reacted to these trends to guide their design and development of products, solutions, and services.
Tags : 
documents, workflows, business process, document management
    
Brother
Published By: Dynatrace     Published Date: Jul 29, 2016
Gap-free data helps you create and manage high-performing applications that deliver a flawless end-user experience and customer loyalty. To be gap-free, you must capture data from every single method in your application infrastructure, end to end, including timing and code-level context for all transactions, services, and tiers, and make the data available for analysis. This eBook gives you the technical and business-case details that will show you why gap-free data is a critical part of your application management strategy.
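A minimal sketch of the idea behind method-level capture (a toy illustration, not Dynatrace's implementation): wrap each method so timing and code-level context are recorded for every call.

    import functools, time

    def traced(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            try:
                return fn(*args, **kwargs)
            finally:
                elapsed_ms = (time.perf_counter() - start) * 1000
                # A real APM agent would ship this to a collector with full
                # transaction and tier context; here we just print it.
                print(f"{fn.__module__}.{fn.__qualname__}: {elapsed_ms:.2f} ms")
        return wrapper

    @traced
    def lookup_balance(account_id: str) -> float:
        time.sleep(0.01)  # stand-in for a database call
        return 42.0

    lookup_balance("acct-1")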
Tags : 
dynatrace, gap free data, applications, application performance, development, devops, application infrastructure, application performance management, user experience, application management, software development, enterprise applications, business technology
    
Dynatrace
Published By: CA Technologies     Published Date: Jul 20, 2017
Mainframes continue to provide high business value by combining efficient transaction processing with high-volume access to critical enterprise data. Business organizations are linking mobile devices to mainframe processing and data to support digital applications and drive business transformation. In this rapidly growing scenario, providing an excellent end-user experience becomes critical for business success. This analyst announcement note covers how CA Technologies is addressing the need for high availability and fast response times by optimizing mainframe performance with new machine learning and analytics capabilities.
Tags : 
    
CA Technologies
Published By: CA Technologies     Published Date: Aug 22, 2017
Effectively supporting today's new business demands has become more complex and challenging. The increased use of mobile devices alone is driving exponential growth in transaction volumes. A customer pushes a button on his or her cell phone, for example, to check a bank balance. That single transaction triggers a cascade of transactions as the request is validated and data is accessed, retrieved, and then sent back to the customer.
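A toy Python sketch of that fan-out (our own illustration; the helper names are hypothetical, not CA's architecture):

    def check_balance(account_id: str) -> float:
        validate_session(account_id)             # transaction 1: authenticate the request
        record = fetch_account(account_id)       # transaction 2: access and retrieve data
        audit_log("balance_check", account_id)   # transaction 3: compliance trail
        return record["balance"]                 # response sent back to the customer

    def validate_session(account_id): pass
    def fetch_account(account_id): return {"balance": 42.0}
    def audit_log(event, account_id): pass

    # One button press on a phone becomes three backend transactions.
    print(check_balance("acct-1"))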
Tags : 
storage, systems, network, applications, data, automation, ca technologies
    
CA Technologies
Published By: Attunity     Published Date: Jan 14, 2019
This whitepaper explores how to automate your data lake pipeline to address common challenges, including how to prevent data lakes from devolving into useless data swamps and how to deliver analytics-ready data via automation. Read Increase Data Lake ROI with Streaming Data Pipelines to learn about:
• Common data lake origins and challenges, including integrating diverse data from multiple data source platforms, with lakes on premises and in the cloud.
• Delivering real-time integration with change data capture (CDC) technology that integrates live transactions with the data lake (a generic sketch follows below).
• Rethinking the data lake with a multi-stage methodology, continuous data ingestion, and merging processes that assemble a historical data store.
• Leveraging a scalable and autonomous streaming data pipeline to deliver analytics-ready data sets for better business insights.
Read this Attunity whitepaper now to get ahead on your data lake strategy in 2019.
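On the CDC point, a generic, library-agnostic sketch (our own illustration, not Attunity Replicate's API) of merging change events into a current table while keeping the history needed for a historical data store:

    history = []   # append-only log of every change event
    current = {}   # latest state, keyed by primary key

    def apply_cdc_event(event: dict) -> None:
        """event: {"op": "insert"|"update"|"delete", "key": ..., "row": {...}}"""
        history.append(event)
        if event["op"] == "delete":
            current.pop(event["key"], None)
        else:  # insert or update
            current[event["key"]] = event["row"]

    apply_cdc_event({"op": "insert", "key": 1, "row": {"id": 1, "amt": 10}})
    apply_cdc_event({"op": "update", "key": 1, "row": {"id": 1, "amt": 15}})
    print(current)        # {1: {'id': 1, 'amt': 15}}
    print(len(history))   # 2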
Tags : 
data lake, data pipeline, change data capture, data swamp, hybrid data integration, data ingestion, streaming data, real-time data, big data, hadoop, agile analytics, cloud data lake, cloud data warehouse, data lake ingestion
    
Attunity
Published By: Attunity     Published Date: Feb 12, 2019
Read this technical whitepaper to learn how data architects and DBAs can avoid the struggle of complex scripting for Kafka in modern data environments. You’ll also gain tips on how to avoid the time-consuming hassle of manually configuring data producers and data type conversions. Specifically, this paper will guide you on how to overcome these challenges by leveraging innovative technology such as Attunity Replicate, which integrates source metadata and schema changes to automatically configure real-time data feeds, and shares related best practices.
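For contrast, a minimal kafka-python sketch (our own illustration; the broker address and topic are assumptions) of the manual producer configuration and data type conversion the paper says you can automate away:

    import json
    from kafka import KafkaProducer  # pip install kafka-python

    producer = KafkaProducer(
        bootstrap_servers="localhost:9092",  # assumed broker address
        # Hand-written data type conversion: every value JSON-encoded.
        value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    )

    # Each database change event must be shaped and sent by hand.
    producer.send("db.changes", {"table": "orders", "op": "insert", "id": 42})
    producer.flush()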
Tags : 
data streaming, kafka, metadata integration, metadata, apache kafka, data integration, data analytics, database transactions, streaming environments, real-time data replication, data configuration
    
Attunity
Published By: Intel     Published Date: Dec 13, 2018
Technology plays a key role in online shopping, where online retailers gain a greater understanding of their customers through data from their browsing and purchasing habits. Today, when consumers shop in brick-and-mortar stores, they expect the same personalized and responsive service. To help retailers achieve this level of service, a combination of hardware and software (Intel® Vision Accelerator Design products, cameras, and AI deep-learning video analysis technology) does the work for you. Uncover how the Advantech system uses the Intel Vision Accelerator Design with the Intel Movidius VPU to drive:
• Overall store performance, such as the number of visitors and transactions, point-of-sale data, sales per shopper, and the store’s ranking, with the ability to distinguish traffic patterns by weather and time of day
• Traffic and sales analysis for better staff allocation and marketing-event planning
• Store heatmap analysis for more precise merchandise placement and product promotion
Tags : 
    
Intel
Published By: HERE Technologies     Published Date: Sep 26, 2018
Advertisers are beginning to invest in location insights, which give them data on how and why transactions are made in specific places. U.S. marketers are poised to double their spend on location-targeted mobile ads between 2017 and 2021, to $32 billion, according to research firm BIA/Kelsey. Understanding location is key to gaining insights and making change. HERE offers data sets and services that advertisers can use to contextualize consumer movements and habits in the world around them, enabling well-timed and relevant advertising.
Tags : 
location data, ad tech, location targeting
    
HERE Technologies
Published By: Larsen & Toubro Infotech(LTI)     Published Date: Jan 31, 2019
LTI built a transaction-monitoring cognitive data lake to facilitate AML transaction monitoring across post-trade transactions for a leading global bank, reducing human errors by 30% and improving turnaround time (TAT) by 50%. Download Complete Case Study.
Tags : 
    
Larsen & Toubro Infotech(LTI)
Published By: Larsen & Toubro Infotech(LTI)     Published Date: Jan 31, 2019
LTI helped a leading global bank digitize its traditional product ecosystem for AML transaction monitoring. With the creation of a data lake and efficient learning models, the bank successfully reduced false positives and improved customer risk assessment. Download Complete Case Study.
Tags : 
    
Larsen & Toubro Infotech(LTI)
Published By: IBM     Published Date: May 17, 2016
Analyst Mike Ferguson of Intelligent Business Strategies writes about the enhanced role of transactional DBMS systems in today's world of Big Data. Learn more about how Big Data provides richer transactional data and how that data is captured and analyzed to meet tomorrow’s business needs. Access the report now.
Tags : 
ibm, intelligent business solutions, big data, transaction data, business analytics
    
IBM
Published By: IBM     Published Date: Jul 05, 2016
This white paper is written for SAP customers evaluating their infrastructure choices; it discusses the evolution of database technology and the options available.
Tags : 
ibm, always on business, analytics, database, database platform, sap, blu, db2, networking, knowledge management, enterprise applications, storage, data management, business technology, data storage, data security
    
IBM
Published By: Workday     Published Date: Jan 17, 2019
At Workday, we take a unique approach to development. We build one technology platform, on a single codeline, giving you a single security model and one source of truth. See how our singular methodology delivers real-time transactional data everyone can trust.
Tags : 
methodology, security model, technology
    
Workday
Published By: Rackspace     Published Date: Apr 15, 2019
Scale events — like online sales and digital product launches — present great revenue opportunities, but they also present large risks to your business. Whether you are a retailer preparing for Black Friday and Cyber Monday, or a digital vendor launching a new service, your brand is both at its most visible and its most vulnerable during these scale events. Many more customers visit your site over a short period of time, raising the potential for resource constraints and discovery of software bugs. Information about issues spreads quickly via social media and news outlets. And, your customers typically spend more per transaction, so every lost order has a greater negative impact on your bottom line. Site reliability engineering (SRE) can help you better prepare for scale events through an iterative cycle of data-driven improvement.
Tags : 
    
Rackspace