"Tech Solutions - one byte at a time!"
DigiBytes.com is the digital library of solutions for business and technology professionals.


data volumes

Results 1 - 25 of 89 | Sort Results By: Published Date | Title | Company Name
Published By: Mimecast     Published Date: Oct 11, 2018
Information management is getting harder. Organizations face increasing data volumes, more stringent legal and regulatory record-keeping requirements, stricter privacy rules, increasing threat of breaches and decreasing employee productivity. Companies are also finding that their old-fashioned, legacy archive strategies are increasingly ineffective. This is driving many organizations to rethink their approach, developing more modern Information Governance strategies.
Tags : 
    
Mimecast
Published By: Group M_IBM Q418     Published Date: Oct 15, 2018
The enterprise data warehouse (EDW) has been the cornerstone of enterprise data strategies for over 20 years. EDW systems have traditionally been built on relatively costly hardware infrastructures. But ever-growing data volume and increasingly complex processing have raised the cost of EDW software and hardware licenses while impacting the performance needed for analytic insights. Organizations can now use EDW offloading and optimization techniques to reduce the costs of storing, processing and analyzing large volumes of data. Getting data governance right is critical to your business success. That means ensuring your data is clean, of excellent quality, and of verifiable lineage. Such governance principles can be applied in Hadoop-like environments. Hadoop is designed to store, process and analyze large volumes of data at significantly lower cost than a data warehouse. But to get the return on investment, you must infuse data governance processes as part of offloading.
Tags : 
    
Group M_IBM Q418
Published By: Amazon Web Services     Published Date: Oct 26, 2018
Today’s organisations are tasked with analysing multiple data types, coming from a wide variety of sources. Faced with massive volumes and heterogeneous types of data, organisations are finding that in order to deliver analytic insights in a timely manner, they need a data storage and analytics solution that offers more agility and flexibility than traditional data management systems. A data lake is an architectural approach that allows you to store enormous amounts of data in a central location, so it’s readily available to be categorised, processed, analysed, and consumed by diverse groups within an organisation. Since data—structured and unstructured—can be stored as-is, there’s no need to convert it to a predefined schema, and you no longer need to know what questions you want to ask of your data beforehand.
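The "store as-is, apply a schema at read time" idea above can be sketched in a few lines. This is an illustrative schema-on-read example (the record names and `query` helper are invented for the sketch, not part of any AWS API): heterogeneous records land in the lake untouched, and structure is imposed only when a consumer asks a question.

```python
import json

# Raw records stored as-is in a central repository -- no upfront schema.
raw_store = [
    '{"type": "clickstream", "page": "/home", "ms": 120}',
    '{"type": "order", "sku": "A-17", "qty": 3}',
    '{"type": "clickstream", "page": "/cart", "ms": 310}',
]

def query(store, record_type):
    """Schema-on-read: parse, filter, and project only at query time."""
    return [json.loads(r) for r in store if json.loads(r)["type"] == record_type]

clicks = query(raw_store, "clickstream")
print(len(clicks))  # 2
```

Because no schema is fixed at write time, a new question (say, filtering on "order" records) needs no migration of previously stored data.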
Tags : 
data, lake, amazon, web, services, aws
    
Amazon Web Services
Published By: Hewlett Packard Enterprise     Published Date: Mar 26, 2018
Modern storage arrays can’t compete on price without a range of data reduction technologies that help reduce the overall total cost of ownership of external storage. Unfortunately, there is no single data reduction technology that fits all data types, and we see savings being made with both data deduplication and compression, depending on the workload. Typically, OLTP-type data (databases) works well with compression and can achieve between 2:1 and 3:1 reduction, depending on the data itself. Deduplication works well with large volumes of repeated data like virtual machines or virtual desktops, where many instances or images are based on a similar “gold” master.
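The workload-dependence described above can be demonstrated with a toy sketch (illustrative only; the sample data is invented and `zlib` stands in for an array's compression engine): structured, repetitive database rows compress well, while identical VM images deduplicate well because each unique block is stored once.

```python
import zlib

# Compression: OLTP-style rows are highly structured and repetitive.
db_page = b"row:42|name:alice|balance:1037.50;" * 64
compressed = zlib.compress(db_page)
ratio = len(db_page) / len(compressed)
print(f"compression ratio ~ {ratio:.1f}:1")

# Deduplication: store each unique image once, keep references for repeats.
vm_images = [b"gold-master-image"] * 10 + [b"custom-image"]
unique_blocks = set(vm_images)
dedup_ratio = len(vm_images) / len(unique_blocks)
print(f"dedup ratio = {dedup_ratio:.1f}:1")
```

Neither technique wins universally: swap in already-random data and the compression ratio collapses toward 1:1, which is why arrays apply both and pick per workload.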
Tags : 
    
Hewlett Packard Enterprise
Published By: Hewlett Packard Enterprise     Published Date: Jul 19, 2018
The next wave of cloud storage innovation is upon us. It’s called multicloud. With multicloud storage you can combine cloud simplicity with enterprise-grade reliability, provide data mobility among multiple cloud types, and eliminate vendor lock-in. And it’s available right now through the Nimble Cloud Volumes service.
Tags : 
cloud, storage, flash
    
Hewlett Packard Enterprise
Published By: Oracle     Published Date: Nov 28, 2017
Today’s leading-edge organizations differentiate themselves through analytics, furthering their competitive advantage by extracting value from all their data sources. Other companies are looking to become data-driven through the modernization of their data management deployments. These strategies come with challenges, such as managing large, growing volumes of data. Today’s digital world is already creating data at an explosive rate, and the next wave is on the horizon, driven by the emergence of IoT data sources. The physical data warehouses of the past were great for collecting data from across the enterprise for analysis, but the storage and compute resources needed to support them cannot keep pace with this explosive growth. In addition, the manual, cumbersome tasks of patching, updating, and upgrading pose risks to data due to human error. To reduce risks, costs, complexity, and time to value, many organizations are taking their data warehouses to the cloud.
Tags : 
    
Oracle
Published By: Oracle CX     Published Date: Oct 19, 2017
Modern technology initiatives are driving IT infrastructure in a new direction. Big data, social business, mobile applications, the cloud, and real-time analytics all require forward-thinking solutions and enough compute power to deliver the performance required in a rapidly evolving digital marketplace. Customers increasingly drive the speed of business, and organizations need to engage with customers on their terms. The need to manage sensitive information with high levels of security, as well as capture, analyze, and act upon massive volumes of data every hour of every day, has become critical. These challenges will dramatically change the way that IT systems are designed, funded, and run compared to the past few decades. Databases and Java have become the de facto foundation in which modern, cloud-ready applications are written. The massive explosion in the volume, variety, and velocity of data increases the need for secure and effective analytics so that organizations can make better decisions.
Tags : 
    
Oracle CX
Published By: Oracle CX     Published Date: Oct 19, 2017
In today’s IT infrastructure, data security can no longer be treated as an afterthought, because billions of dollars are lost each year to computer intrusions and data exposures. This issue is compounded by the aggressive build-out of cloud computing. Big data and machine learning applications that perform tasks such as fraud and intrusion detection, trend detection, and click-stream and social media analysis all require forward-thinking solutions and enough compute power to deliver the performance required in a rapidly evolving digital marketplace. Companies increasingly need to drive up the speed of business, and organizations need to support their customers with real-time data. The task of managing sensitive information while capturing, analyzing, and acting upon massive volumes of data every hour of every day has become critical. These challenges have dramatically changed the way that IT systems are architected, provisioned, and run compared to the past few decades.
Tags : 
    
Oracle CX
Published By: Dell PC Lifecycle     Published Date: Mar 09, 2018
Compression algorithms reduce the number of bits needed to represent a given set of data. The higher the compression ratio, the more space this particular data reduction technique saves. During our OLTP test, the Unity array achieved a compression ratio of 3.2:1 on the database volumes, while the 3PAR array averaged only a 1.3:1 ratio. In our data mart loading test, the 3PAR achieved a 1.4:1 ratio on the database volumes, while the Unity array achieved only 1.3:1.
Tags : 
    
Dell PC Lifecycle
Published By: Dell EMC     Published Date: Nov 10, 2015
From your most critical workloads to your cold data, a scale-out or scale-up storage solution — one that can automatically tier volumes or data to the most appropriate arrays or media (flash SSDs or HDDs) and offers advanced software features to help ensure availability and reliability — can help you efficiently manage your data center.
Tags : 
    
Dell EMC
Published By: Dell EMC     Published Date: Oct 08, 2015
To compete in this new multi-channel environment, we’ve seen in this guide how retailers have to adopt new and innovative strategies to attract and retain customers. Big data technologies, specifically Hadoop, enable retailers to connect with customers through multiple channels at an entirely new level by harnessing the vast volumes of new data available today. Hadoop helps retailers store, transform, integrate and analyze a wide variety of online and offline customer data—POS transactions, e-commerce transactions, clickstream data, email, social media, sensor data and call center records—all in one central repository.
Tags : 
    
Dell EMC
Published By: IBM APAC     Published Date: Jul 09, 2017
Organizations today collect a tremendous amount of data and are bolstering their analytics capabilities to generate new, data-driven insights from this expanding resource. To make the most of growing data volumes, they need to provide rapid access to data across the enterprise. At the same time, they need efficient and workable ways to store and manage data over the long term. A governed data lake approach offers an opportunity to manage these challenges. Download this white paper to find out more.
Tags : 
data lake, big data, analytics
    
IBM APAC
Published By: Amazon Web Services     Published Date: Jul 25, 2018
What is a Data Lake? Today’s organizations are tasked with managing multiple data types, coming from a wide variety of sources. Faced with massive volumes and heterogeneous types of data, organizations are finding that in order to deliver insights in a timely manner, they need a data storage and analytics solution that offers more agility and flexibility than traditional data management systems. Data Lakes are a new and increasingly popular way to store and analyze data that addresses many of these challenges. A Data Lake allows an organization to store all of its data, structured and unstructured, in one centralized repository. Since data can be stored as-is, there is no need to convert it to a predefined schema, and you no longer need to know what questions you want to ask of your data beforehand. Download to find out more now.
Tags : 
    
Amazon Web Services
Published By: Akamai Technologies     Published Date: Apr 25, 2018
Cyber attackers are targeting the application programming interfaces (APIs) used by businesses to share data with customers. Consumer mobile adoption, electronic goods and services, and high volumes of data have led businesses to use APIs for data exchange. Unfortunately, attackers can also use APIs to access or deny service to valuable data and systems. This white paper explores strategies for protecting APIs. You’ll learn about APIs, how and why these endpoints are targets for web application attacks, security models, and how Akamai can help.
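One building block commonly used in the API-protection strategies the paper surveys is request signing, so an endpoint can reject tampered or unauthenticated calls. The sketch below is a generic, minimal illustration (the secret, payload, and function names are invented for this example and are not Akamai's API): the client signs the request body with a shared secret, and the server verifies the signature in constant time.

```python
import hashlib
import hmac

# Hypothetical shared secret provisioned out of band.
SECRET = b"demo-shared-secret"

def sign(payload: bytes) -> str:
    """Client side: HMAC-SHA256 over the request body."""
    return hmac.new(SECRET, payload, hashlib.sha256).hexdigest()

def verify(payload: bytes, signature: str) -> bool:
    """Server side: compare_digest avoids timing side channels."""
    return hmac.compare_digest(sign(payload), signature)

body = b'{"customer_id": 42}'
sig = sign(body)
print(verify(body, sig))          # True
print(verify(b"tampered", sig))   # False
```

Signing alone does not address volumetric abuse; production API defenses typically layer it with rate limiting, authentication, and traffic inspection.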
Tags : 
api, security, interface, businesses, data, mobile, adoption
    
Akamai Technologies
Published By: MarkLogic     Published Date: Jun 09, 2017
Today, data is big, fast, varied and constantly changing. As a result, organizations are managing hundreds of systems and petabytes of data. However, many organizations are unable to get the most value from their data because they’re using RDBMS technology to solve problems it wasn’t designed to solve. Why change? In this white paper, we dive into the details of why relational databases are ill-suited to handle the massive volumes of disparate, varied, and changing data that organizations have in their data centers. It is for this reason that leading organizations are going beyond relational to embrace new kinds of databases. And when they do, the results can be dramatic.
Tags : 
    
MarkLogic
Published By: Dell PC Lifecycle     Published Date: Mar 09, 2018
Compression algorithms reduce the number of bits needed to represent a set of data—the higher the compression ratio, the more space this particular data reduction technique saves. During our OLTP test, the Unity array achieved a compression ratio of 3.2-to-1 on the database volumes, whereas the 3PAR array averaged a 1.3-to-1 ratio. In our data mart loading test, the 3PAR achieved a ratio of 1.4-to-1 on the database volumes, whereas the Unity array achieved 1.3-to-1.
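For readers translating these ratios into capacity planning, an r:1 compression ratio implies a fractional space saving of 1 − 1/r. A quick sketch of that arithmetic (the helper name is ours, not the vendor's):

```python
def space_saved(ratio: float) -> float:
    """Fraction of raw capacity freed by an r:1 compression ratio."""
    return 1 - 1 / ratio

# The ratios quoted in the OLTP test above:
print(f"3.2:1 saves {space_saved(3.2):.1%}")
print(f"1.3:1 saves {space_saved(1.3):.1%}")
```

So the gap between 3.2:1 and 1.3:1 is larger than it may look: roughly two thirds of the raw capacity reclaimed versus under a quarter.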
Tags : 
    
Dell PC Lifecycle
Published By: Dell PC Lifecycle     Published Date: Mar 09, 2018
Compression algorithms reduce the number of bits needed to represent a set of data; the higher the compression ratio, the more space this data reduction technique saves. During our OLTP test, the Unity array achieved a compression ratio of 3.2-to-1 on the database volumes, while the 3PAR array averaged a 1.3-to-1 ratio. In the data mart loading test, the 3PAR achieved a 1.4-to-1 ratio on the database volumes, while the Unity array recorded 1.3-to-1.
Tags : 
    
Dell PC Lifecycle
Published By: CA Technologies_Business_Automation     Published Date: Jun 29, 2018
Challenge: It is not uncommon for SAP system copies, including any post-editing, to take several days to complete. Meanwhile, testing, development and training activities come to a standstill, and the large number of manual tasks in the entire process ties up highly skilled SAP BASIS staff.
Opportunity: Enterprises are looking to automation as a way to accelerate SAP system copies and free up staff. However, this is only one part of the problem: what further complicates the system copy process is the need to safeguard sensitive data and manage huge data volumes while also ensuring that the data used in non-production systems adequately reflects the data in production systems, so the quality of development, testing and training activities is not compromised.
Benefits: This white paper explains how a considerable portion of the SAP system copy process can be automated using the CA Automic Automated System Copy for SAP solution and SNP T-Bone, helping enterprises become more agile.
Tags : 
    
CA Technologies_Business_Automation
Published By: Riverbed     Published Date: May 24, 2012
Data transfer bottlenecks and unpredictability on the Wide Area Network (WAN) can hurt application performance. In addition, the time required to migrate large volumes of data to and from data centers can be a serious concern to business continuity.
Tags : 
wan, wan optimization, riverbed, virtual steelhead, data center
    
Riverbed

Copyright 2009-2014 by Modern Analyst Media LLC