Products found: 9


Amazon EMR

Amazon EMR provides a managed Hadoop framework that makes it easy, fast, and cost-effective to process vast amounts of data across dynamically scalable Amazon EC2 instances. You can also run other popular distributed frameworks such as Apache Spark, HBase, Presto, and Flink in EMR, and interact with data in other AWS data stores such as Amazon S3 and Amazon DynamoDB. EMR Notebooks, based on the popular Jupyter Notebook, provide a development and collaboration environment for ad hoc querying and exploratory analysis. EMR securely and reliably handles a broad set of big data use cases, including log analysis, web indexing, data transformations (ETL), machine learning, financial analysis, scientific simulation, and bioinformatics.

 

BENEFITS

EASY TO USE. You can launch an EMR cluster in minutes. You don't need to worry about node provisioning, cluster setup, Hadoop configuration, or cluster tuning; EMR takes care of these tasks so you can focus on analysis. Data scientists, developers, and analysts can also use EMR Notebooks, a managed environment based on Jupyter Notebook, to build applications and collaborate with peers.

LOW COST. EMR pricing is simple and predictable: you pay a per-instance rate for every second used, with a one-minute minimum charge. You can launch a 10-node EMR cluster with applications such as Hadoop, Spark, and Hive for as little as $0.15 per hour. Because EMR has native support for Amazon EC2 Spot and Reserved Instances, you can also save 50-80% on the cost of the underlying instances.

ELASTIC. With EMR, you can provision one, hundreds, or thousands of compute instances to process data at any scale. You can easily increase or decrease the number of instances manually or with Auto Scaling, and you only pay for what you use. EMR also decouples compute instances and persistent storage, so they can be scaled independently.

RELIABLE. You can spend less time tuning and monitoring your cluster. EMR has tuned Hadoop for the cloud; it also monitors your cluster, retrying failed tasks and automatically replacing poorly performing instances. EMR provides the latest stable open source software releases, so you don't have to manage updates and bug fixes, leading to fewer issues and less effort to maintain the environment.

SECURE. EMR automatically configures EC2 firewall settings that control network access to instances, and you can launch clusters in an Amazon Virtual Private Cloud (VPC), a logically isolated network you define. For objects stored in S3, you can use S3 server-side encryption or Amazon S3 client-side encryption with EMRFS, with AWS Key Management Service or customer-managed keys. You can also easily enable other encryption options and authentication with Kerberos.

FLEXIBLE. You have complete control over your cluster. You have root access to every instance, you can easily install additional applications, and you can customize every cluster with bootstrap actions. You can also launch EMR clusters with custom Amazon Linux AMIs.
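As an illustration of how little setup a cluster launch requires, the following is a minimal sketch (one of several possible approaches) that starts a small Spark/Hive cluster with the boto3 EMR API. The release label, instance types, log bucket, and IAM role names are placeholder assumptions; the default EMR roles must already exist in the account.

```python
# Minimal sketch: launching a small EMR cluster with Spark and Hive via boto3.
# The log bucket, region, and role names below are placeholders.
import boto3

emr = boto3.client("emr", region_name="us-east-1")

response = emr.run_job_flow(
    Name="example-spark-cluster",
    ReleaseLabel="emr-6.10.0",                # any supported EMR release label
    Applications=[{"Name": "Spark"}, {"Name": "Hive"}],
    LogUri="s3://my-emr-logs/",               # placeholder log bucket
    Instances={
        "InstanceGroups": [
            {"Name": "Primary", "InstanceRole": "MASTER",
             "InstanceType": "m5.xlarge", "InstanceCount": 1},
            {"Name": "Core", "InstanceRole": "CORE",
             "InstanceType": "m5.xlarge", "InstanceCount": 2},
        ],
        "KeepJobFlowAliveWhenNoSteps": True,
        "TerminationProtected": False,
    },
    JobFlowRole="EMR_EC2_DefaultRole",        # default EC2 instance profile
    ServiceRole="EMR_DefaultRole",            # default service role
    VisibleToAllUsers=True,
)
print("Cluster ID:", response["JobFlowId"])
```

A cluster launched this way keeps running until it is shut down with terminate_job_flows, so it can also back interactive EMR Notebooks sessions in the meantime.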

Amazon S3

Amazon Simple Storage Service (Amazon S3) is an object storage service that offers industry-leading scalability, data availability, security, and performance. This means customers of all sizes and industries can use it to store and protect any amount of data for a range of use cases, such as websites, mobile applications, backup and restore, archive, enterprise applications, IoT devices, and big data analytics. Amazon S3 provides easy-to-use management features so you can organize your data and configure finely-tuned access controls to meet your specific business, organizational, and compliance requirements. Amazon S3 is designed for 99.999999999% (11 9's) of durability, and stores data for millions of applications for companies all around the world. Main benefits:
Industry-leading performance, scalability, availability, and durability. Scale your storage resources up and down to meet fluctuating demands, without upfront investments or resource procurement cycles. Amazon S3 is designed for 99.999999999% data durability because it automatically creates and stores copies of all S3 objects across multiple systems. This means your data is available when needed and protected against failures, errors, and threats.

Wide range of cost-effective storage classes. Save costs without sacrificing performance by storing data across the S3 Storage Classes, which support different data access levels at corresponding rates. You can use S3 Storage Class Analysis to discover data that should move to a lower-cost storage class based on access patterns, and configure an S3 Lifecycle policy to execute the transfer. You can also store data with changing or unknown access patterns in S3 Intelligent-Tiering, which tiers objects based on changing access patterns and automatically delivers cost savings.

Unmatched security, compliance, and audit capabilities. Store your data in Amazon S3 and secure it from unauthorized access with encryption features and access management tools. You can also use Amazon Macie to identify sensitive data stored in your S3 buckets and detect irregular access requests. Amazon S3 maintains compliance programs, such as PCI-DSS, HIPAA/HITECH, FedRAMP, EU Data Protection Directive, and FISMA, to help you meet regulatory requirements. AWS also supports numerous auditing capabilities to monitor access requests to your S3 resources.

Management tools for granular data control. Classify, manage, and report on your data using features such as: S3 Storage Class Analysis to analyze access patterns; S3 Lifecycle policies to transfer objects to lower-cost storage classes; S3 Cross-Region Replication to replicate data into other regions; S3 Object Lock to apply retention dates to objects and protect them from deletion; and S3 Inventory to get visibility into your stored objects, their metadata, and encryption status. You can also use S3 Batch Operations to change object properties and perform storage management tasks for billions of objects. Since Amazon S3 works with AWS Lambda, you can log activities, define alerts, and automate workflows without managing additional infrastructure.

Query-in-place services for analytics. Run big data analytics across your S3 objects (and other data sets in AWS) with query-in-place services. Use Amazon Athena to query S3 data with standard SQL expressions and Amazon Redshift Spectrum to analyze data that is stored across your AWS data warehouses and S3 resources. You can also use S3 Select to retrieve subsets of object data, instead of the entire object, and improve query performance by up to 400%.

Most supported cloud storage service. Store and protect your data in Amazon S3 by working with a partner from the AWS Partner Network (APN), the largest community of technology and consulting cloud services providers. The APN recognizes migration partners that transfer data to Amazon S3 and storage partners that offer S3-integrated solutions for primary storage, backup and restore, archive, and disaster recovery. You can also purchase an AWS-integrated solution directly from the AWS Marketplace, which lists hundreds of storage-specific offerings.
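To make the lifecycle-based cost savings above concrete, here is a minimal sketch of an S3 Lifecycle rule applied with boto3. The bucket name, prefix, and transition schedule are placeholder assumptions, not recommendations.

```python
# Minimal sketch: an S3 Lifecycle rule that moves objects under a prefix to
# lower-cost storage classes over time and expires them after a year.
# "my-example-bucket" and the "logs/" prefix are placeholders.
import boto3

s3 = boto3.client("s3")

s3.put_bucket_lifecycle_configuration(
    Bucket="my-example-bucket",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "archive-logs",
                "Filter": {"Prefix": "logs/"},
                "Status": "Enabled",
                "Transitions": [
                    {"Days": 30, "StorageClass": "STANDARD_IA"},
                    {"Days": 90, "StorageClass": "GLACIER"},
                ],
                "Expiration": {"Days": 365},
            }
        ]
    },
)
```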

AWS Storage Gateway

AWS Storage Gateway is a hybrid storage service that enables your on-premises applications to seamlessly use AWS cloud storage. You can use the service for backup and archiving, disaster recovery, cloud data processing, storage tiering, and migration. The service helps you reduce and simplify your datacenter and branch or remote office storage infrastructure. Your applications connect to the service through a virtual machine or hardware gateway appliance using standard storage protocols such as NFS, SMB, and iSCSI. The gateway connects to AWS storage services such as Amazon S3, Amazon S3 Glacier, Amazon S3 Glacier Deep Archive, Amazon EBS, and AWS Backup, providing storage for files, volumes, snapshots, and virtual tapes in AWS. The service includes a highly optimized data transfer mechanism, with bandwidth management, automated network resilience, and efficient data transfer, along with a local cache for low-latency on-premises access to your most active data.

FEATURES

Introducing Storage Gateway. AWS Storage Gateway is a hybrid cloud storage service that connects your existing on-premises environments with the AWS Cloud. Its features make it easy for you to run hybrid cloud workloads at any stage of your cloud adoption, whether it's getting started with cloud backups, running cloud processing workflows for data generated by on-premises machines, or performing a one-time migration of block volume data or databases.

Key Features

Standard Storage Protocols. Storage Gateway seamlessly connects to your local production or backup applications with NFS, SMB, iSCSI, or iSCSI-VTL, so you can adopt AWS Cloud storage without needing to modify your applications. Its protocol conversion and device emulation enable you to access block data on volumes managed by Storage Gateway on top of Amazon S3, store files as native Amazon S3 objects, and keep virtual tape backups online in a Virtual Tape Library backed by S3 or move the backups to a tape archive tier on Amazon Glacier.

Fully Managed Cache. The local gateway appliance maintains a cache of recently written or read data so your applications can have low-latency access to data that is stored durably in AWS. The gateways use a read-through and write-back cache.

Optimized and Secured Data Transfer. Storage Gateway provides secure upload of changed data and secure downloads of requested data, encrypting data in transit between any type of gateway appliance and AWS using SSL. Optimizations such as multi-part management, automatic buffering, and delta transfers are used across all gateway types, and data compression is applied for all block and virtual tape data.

AWS Integrated. As a native AWS service, Storage Gateway integrates with other AWS services for storage, backup, and management. The service stores files as native Amazon S3 objects, archives virtual tapes in Amazon Glacier, and stores EBS Snapshots generated by the Volume Gateway with Amazon EBS. Storage Gateway also integrates with AWS Backup to manage backup and recovery of Volume Gateway volumes, simplifying your backup management and helping you meet your business and regulatory backup compliance requirements. Additionally, Storage Gateway provides a consistent management experience using the AWS Console, both for on-premises gateways and for monitoring, management, and security with AWS services such as Amazon CloudWatch, AWS CloudTrail, AWS Identity and Access Management (IAM), and AWS Key Management Service (KMS).
Gateway Types

File Gateway. The File Gateway presents a file interface that enables you to store files as objects in Amazon S3 using the industry-standard NFS and SMB file protocols, access those files via NFS and SMB from your datacenter or Amazon EC2, or access those files as objects with the S3 API. POSIX-style metadata, including ownership, permissions, and timestamps, is durably stored in Amazon S3 in the user metadata of the object associated with the file. Once objects are transferred to S3, they can be managed as native S3 objects, and bucket policies such as versioning, lifecycle management, and cross-region replication apply directly to objects stored in your bucket. Customers use the File Gateway to store file data into S3 for use by object-based workloads including data analytics or machine learning, as a cost-effective storage target for backups, and as a repository or tier in the cloud for application file storage.

Volume Gateway. The Volume Gateway presents block storage volumes to your applications using the iSCSI protocol. Data written to these volumes can be asynchronously backed up as point-in-time snapshots of your volumes and stored in the cloud as Amazon EBS snapshots. You can back up your on-premises Volume Gateway volumes using the service's native snapshot scheduler or the AWS Backup service. In both cases, volume backups are stored as Amazon EBS snapshots in AWS. These snapshots are incremental backups that capture only changed blocks. All snapshot storage is also compressed to minimize your storage charges. When connecting to the Volume Gateway with the iSCSI block interface, you can run the gateway in two modes: cached and stored. In cached mode, you store your primary data in Amazon S3 and retain your frequently accessed data locally in cache. With this mode, you can achieve substantial cost savings on primary storage, minimizing the need to scale your storage on-premises, while retaining low-latency access to your frequently accessed data. In stored mode, you store your entire data set locally, while making an asynchronous copy of your volume in Amazon S3 and point-in-time EBS snapshots. This mode provides durable and inexpensive offsite backups that you can recover locally, to another site, or in Amazon EC2. Customers often choose the Volume Gateway to back up local applications, and use it for disaster recovery based on EBS Snapshots or Cached Volume Clones. The Volume Gateway integration with AWS Backup enables customers to use the AWS Backup service to protect on-premises applications that use Storage Gateway volumes. AWS Backup supports backup and restore of both cached and stored volumes. Using AWS Backup with Volume Gateway helps you centralize backup management, reduce your operational burden, and meet compliance requirements (a minimal API sketch follows the list below). AWS Backup enables you to:
  • Set customizable scheduled backup policies that meet your backup requirements;
  • Set backup retention and expiration rules so you no longer need to develop custom scripts or manually manage the point-in-time backups of your volumes; and
  • Manage and monitor backups across multiple gateways and other AWS resources from a central view.
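A minimal sketch of such a backup plan, created through the AWS Backup API with boto3, is shown below. The vault name, schedule, retention period, IAM role, and Storage Gateway volume ARN are placeholder assumptions; in practice these come from your own account and gateway.

```python
# Minimal sketch: a daily AWS Backup plan with a retention rule, plus a resource
# assignment pointing at a Storage Gateway volume. All ARNs and names are placeholders.
import boto3

backup = boto3.client("backup")

plan = backup.create_backup_plan(
    BackupPlan={
        "BackupPlanName": "gateway-volumes-daily",
        "Rules": [
            {
                "RuleName": "daily-0300-utc",
                "TargetBackupVaultName": "Default",
                "ScheduleExpression": "cron(0 3 * * ? *)",   # every day at 03:00 UTC
                "Lifecycle": {"DeleteAfterDays": 35},        # retention/expiration rule
            }
        ],
    }
)

# Assign resources (here, a placeholder Storage Gateway volume ARN) to the plan.
backup.create_backup_selection(
    BackupPlanId=plan["BackupPlanId"],
    BackupSelection={
        "SelectionName": "gateway-volumes",
        "IamRoleArn": "arn:aws:iam::123456789012:role/service-role/AWSBackupDefaultServiceRole",
        "Resources": [
            "arn:aws:storagegateway:us-east-1:123456789012:gateway/sgw-EXAMPLE/volume/vol-EXAMPLE"
        ],
    },
)
```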
Tape Gateway. The Tape Gateway presents itself to your existing backup application as an industry-standard iSCSI-based virtual tape library (VTL), consisting of a virtual media changer and virtual tape drives. You can continue to use your existing backup applications and workflows while writing to a nearly limitless collection of virtual tapes. Each virtual tape is stored in Amazon S3. When you no longer require immediate or frequent access to data contained on a virtual tape, you can have your backup application move it from the Storage Gateway Virtual Tape Library into an archive tier that sits on top of Amazon Glacier cloud storage, further reducing storage costs. Storage Gateway is currently compatible with most leading backup applications. The Tape Gateway's VTL interface eliminates large upfront tape automation capital expenses, multi-year maintenance contract commitments, and ongoing media costs. You pay only for the capacity you use and scale as your needs grow. The need to transport storage media to offsite facilities and handle tape media manually goes away, and your archives benefit from the design and durability of the AWS cloud platform.

Storage Gateway Deployment Options. The AWS Storage Gateway service consists of its in-cloud components, including the management console, storage infrastructure, and back-end control and integration services and APIs, and the gateway appliance that you deploy and connect to your applications. You have four options for deployment: a virtual machine containing the Storage Gateway software running on VMware ESXi or Microsoft Hyper-V on premises; a hardware appliance on premises; a VM in VMware Cloud on AWS; or an AMI in Amazon EC2.

Storage Gateway as a hardware appliance. Storage Gateway is available pre-installed on a hardware appliance, a Dell EMC PowerEdge R640XL server with a validated configuration. The hardware appliance provides a simple procurement, deployment, and management experience for customers who have limited virtualized infrastructure, burdensome centralized resource provisioning processes, or limited IT staffing.

AWS Storage Gateway pricing. You pay only for what you use with AWS Storage Gateway and are charged based on the type and amount of storage you use, the requests you make, and the amount of data transferred out of AWS.

BENEFITS

Integrated. Hybrid cloud storage means your data can be used on-premises and stored durably in AWS Cloud storage services, including Amazon S3, Amazon S3 Glacier, Amazon S3 Glacier Deep Archive, and Amazon EBS. Once data is moved to AWS, you can apply AWS compute, machine learning, and big data analytics services to it. Additionally, you can leverage the full AWS portfolio of security and management services including AWS Backup, AWS KMS, AWS Identity and Access Management (IAM), SNS workflows, Amazon CloudWatch, and AWS CloudTrail.

Performance. AWS Storage Gateway caches data in the local VM or hardware gateway appliance, providing low-latency disk and network performance for your most active data, with optimized data transfers occurring to AWS Cloud storage tiers in the background. Users and applications continue to operate using a local storage model while you take advantage of a cloud back end.

Optimized transfers. Compression, encryption, and bandwidth management are built in. Storage Gateway manages local cache offloads to the cloud based on your desired performance parameters, so you can fine-tune the balance of latency and scale for your workloads. Only data that changes is transferred, so you can optimize your network bandwidth.

Simple. No disruptions required. Download and install the virtual machine or deploy the dedicated hardware appliance, select an interface, and assign local cache capacity. The advanced networking and protocol support are all included, which means no clients to install and no network or firewall settings to tune. And the virtual appliance can run both on-premises and in Amazon EC2 to serve your in-cloud applications.

Durable and secure. Data stored through AWS Storage Gateway benefits from the durability and security embedded in AWS cloud storage services. Storage management tools like versioning, cross-region replication, and lifecycle management policies can lower the cost of long-term archiving, simplify audit and compliance requirements, and safeguard all of your data, not just the parts kept on-premises. All data that Storage Gateway transfers to AWS is encrypted in transit, and encrypted at rest in AWS.
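Returning to the File Gateway described under Gateway Types, the following is a hedged sketch that creates an NFS file share on an already activated gateway using the boto3 Storage Gateway API. The gateway ARN, IAM role, and target S3 bucket are placeholder assumptions.

```python
# Minimal sketch: creating an NFS file share on an activated File Gateway.
# The gateway ARN, IAM role, and bucket ARN below are placeholders; the gateway
# itself must already be deployed and activated.
import uuid
import boto3

sgw = boto3.client("storagegateway", region_name="us-east-1")

share = sgw.create_nfs_file_share(
    ClientToken=str(uuid.uuid4()),            # idempotency token
    GatewayARN="arn:aws:storagegateway:us-east-1:123456789012:gateway/sgw-EXAMPLE",
    Role="arn:aws:iam::123456789012:role/StorageGatewayS3AccessRole",
    LocationARN="arn:aws:s3:::my-example-bucket",
    DefaultStorageClass="S3_STANDARD",        # files land as native S3 objects
)
print("File share ARN:", share["FileShareARN"])
```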

AWS Auto Scaling

AWS Auto Scaling monitors your applications and automatically adjusts capacity to maintain steady, predictable performance at the lowest possible cost. Using AWS Auto Scaling, it's easy to set up application scaling for multiple resources across multiple services in minutes. The service provides a simple, powerful user interface that lets you build scaling plans for resources including Amazon EC2 instances and Spot Fleets, Amazon ECS tasks, Amazon DynamoDB tables and indexes, and Amazon Aurora Replicas. AWS Auto Scaling makes scaling simple with recommendations that allow you to optimize performance, costs, or balance between them. If you're already using Amazon EC2 Auto Scaling to dynamically scale your Amazon EC2 instances, you can now combine it with AWS Auto Scaling to scale additional resources for other AWS services. With AWS Auto Scaling, your applications always have the right resources at the right time. It's easy to get started with AWS Auto Scaling using the AWS Management Console, Command Line Interface (CLI), or SDK. AWS Auto Scaling is available at no additional charge. You pay only for the AWS resources needed to run your applications and Amazon CloudWatch monitoring fees.

Benefits

SET UP SCALING QUICKLY. AWS Auto Scaling lets you set target utilization levels for multiple resources in a single, intuitive interface. You can quickly see the average utilization of all of your scalable resources without having to navigate to other consoles. For example, if your application uses Amazon EC2 and Amazon DynamoDB, you can use AWS Auto Scaling to manage resource provisioning for all of the EC2 Auto Scaling groups and database tables in your application.

MAKE SMART SCALING DECISIONS. AWS Auto Scaling lets you build scaling plans that automate how groups of different resources respond to changes in demand. You can optimize availability, costs, or a balance of both. AWS Auto Scaling automatically creates all of the scaling policies and sets targets for you based on your preference. AWS Auto Scaling monitors your application and automatically adds or removes capacity from your resource groups in real time as demands change.

AUTOMATICALLY MAINTAIN PERFORMANCE. Using AWS Auto Scaling, you maintain optimal application performance and availability, even when workloads are periodic, unpredictable, or continuously changing. AWS Auto Scaling continually monitors your applications to make sure that they are operating at your desired performance levels. When demand spikes, AWS Auto Scaling automatically increases the capacity of constrained resources so you maintain a high quality of service.

PAY ONLY FOR WHAT YOU NEED. AWS Auto Scaling can help you optimize your utilization and cost efficiencies when consuming AWS services so you only pay for the resources you actually need. When demand drops, AWS Auto Scaling will automatically remove any excess resource capacity so you avoid overspending. AWS Auto Scaling is free to use, and allows you to optimize the costs of your AWS environment.
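The scaling-plan workflow above is console-driven, but the same target-tracking behavior can be sketched programmatically. Below is a minimal example that uses the Application Auto Scaling API (the service that AWS Auto Scaling builds on) to keep a DynamoDB table's read utilization near 70%; the table name and capacity limits are placeholder assumptions.

```python
# Minimal sketch: target-tracking scaling for a DynamoDB table's read capacity
# via the Application Auto Scaling API. "MyExampleTable" and the limits are placeholders.
import boto3

aas = boto3.client("application-autoscaling")

aas.register_scalable_target(
    ServiceNamespace="dynamodb",
    ResourceId="table/MyExampleTable",
    ScalableDimension="dynamodb:table:ReadCapacityUnits",
    MinCapacity=5,
    MaxCapacity=200,
)

aas.put_scaling_policy(
    PolicyName="read-capacity-target-70",
    ServiceNamespace="dynamodb",
    ResourceId="table/MyExampleTable",
    ScalableDimension="dynamodb:table:ReadCapacityUnits",
    PolicyType="TargetTrackingScaling",
    TargetTrackingScalingPolicyConfiguration={
        "TargetValue": 70.0,   # keep consumed reads near 70% of provisioned capacity
        "PredefinedMetricSpecification": {
            "PredefinedMetricType": "DynamoDBReadCapacityUtilization"
        },
    },
)
```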

AWS CloudFormation

AWS CloudFormation provides a common language for you to describe and provision all the infrastructure resources in your cloud environment. CloudFormation allows you to use a simple text file to model and provision, in an automated and secure manner, all the resources needed for your applications across all regions and accounts. This file serves as the single source of truth for your cloud environment.  AWS CloudFormation is available at no additional charge, and you pay only for the AWS resources needed to run your applications.

Benefits

MODEL IT ALL. AWS CloudFormation allows you to model your entire infrastructure in a text file. This template becomes the single source of truth for your infrastructure. This helps you to standardize infrastructure components used across your organization, enabling configuration compliance and faster troubleshooting.

AUTOMATE AND DEPLOY. AWS CloudFormation provisions your resources in a safe, repeatable manner, allowing you to build and rebuild your infrastructure and applications without having to perform manual actions or write custom scripts. CloudFormation takes care of determining the right operations to perform when managing your stack, and rolls back changes automatically if errors are detected.

IT'S JUST CODE. Codifying your infrastructure allows you to treat your infrastructure as just code. You can author it with any code editor, check it into a version control system, and review the files with team members before deploying into production.
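As a small illustration of the "it's just code" idea, here is a minimal sketch that deploys a one-resource template with boto3. The stack name and bucket resource are placeholders; a real template would model the full application.

```python
# Minimal sketch: creating a CloudFormation stack from a tiny inline template.
# TemplateBody accepts JSON or YAML; the stack name is a placeholder.
import boto3

TEMPLATE = """
AWSTemplateFormatVersion: '2010-09-09'
Description: Example stack with a single S3 bucket
Resources:
  ExampleBucket:
    Type: AWS::S3::Bucket
"""

cfn = boto3.client("cloudformation")

cfn.create_stack(StackName="example-stack", TemplateBody=TEMPLATE)

# Wait until creation finishes, then inspect the stack status.
cfn.get_waiter("stack_create_complete").wait(StackName="example-stack")
print(cfn.describe_stacks(StackName="example-stack")["Stacks"][0]["StackStatus"])
```

Because the template is plain text, it can live in version control and be reviewed like any other code change before deployment.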

Azure Data Warehouse

Create a single hub for all your data, whether structured, unstructured, or streaming. Power transformational solutions such as business intelligence, reporting, advanced analytics, and real-time analytics. To get started easily, take advantage of the performance, flexibility, and security of Azure's fully managed services, such as SQL Azure and Azure Databricks.

Get rid of worries


Built-in advanced security features include transparent data encryption, auditing, threat detection, integration with Azure Active Directory, and virtual network endpoints. Azure services hold more than 50 industry and geographic certifications and are available worldwide in 42 regions, so you can store your data wherever your users are located. Finally, Microsoft offers financially backed service level agreements to spare you any hassle.


Google Cloud Storage

Geo-redundant storage with the highest level of availability and performance is ideal for low-latency, high-QPS content serving to users distributed across geographic regions. Google Cloud Storage provides the availability and throughput needed to stream audio or video directly to apps or websites. The highest level of availability and performance within a single region is ideal for compute, analytics, and ML workloads in a particular region. Cloud Storage is also strongly consistent, giving you confidence and accuracy in analytics workloads. Cloud Storage provides fast, low-cost, and highly durable storage for data accessed less than once a month, perfect for reducing the cost of backups and archives while still retaining immediate access. Backup data in Cloud Storage can be used for more than just recovery because all Cloud Storage classes offer millisecond latency and are accessed through a single API.
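Since every storage class is reached through the same API, one client call pattern covers serving, analytics, and backup use cases. Below is a minimal sketch using the google-cloud-storage Python client; the bucket name, object name, and local file are placeholder assumptions, and credentials are expected to come from Application Default Credentials.

```python
# Minimal sketch: uploading and reading back an object with google-cloud-storage.
# Bucket and object names are placeholders; authentication uses the environment's
# Application Default Credentials.
from google.cloud import storage

client = storage.Client()
bucket = client.bucket("my-example-bucket")

# Upload a local file as an object.
blob = bucket.blob("videos/clip-001.mp4")
blob.upload_from_filename("clip-001.mp4")

# Read it back; the same calls work regardless of the object's storage class.
data = blob.download_as_bytes()
print(len(data), "bytes downloaded")
```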


IBM Cloud Object Storage

IBM Cloud Object Storage is a highly scalable cloud storage service, designed for high durability, resiliency, and security. It is designed for data durability of 99.999999999 percent. Data is sliced, and slices are dispersed across multiple devices in multiple facilities for resiliency. High data durability is maintained by built-in integrity checking and self-repair capabilities. Data at rest is secured using server-side encryption and data in motion is secured using carrier-grade TLS/SSL. Gain additional control with role-based policies and use IBM Cloud Identity & Access Management to set bucket-level permissions.
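Access is through an S3-compatible API. The following is a hedged sketch using the ibm-cos-sdk Python client (ibm_boto3); the API key, service instance CRN, regional endpoint, and bucket name are placeholder assumptions.

```python
# Minimal sketch: writing and reading an object in IBM Cloud Object Storage with
# the ibm-cos-sdk (S3-compatible) client. All credentials and names are placeholders.
import ibm_boto3
from ibm_botocore.client import Config

cos = ibm_boto3.client(
    "s3",
    ibm_api_key_id="<IBM_CLOUD_API_KEY>",
    ibm_service_instance_id="<COS_SERVICE_INSTANCE_CRN>",
    config=Config(signature_version="oauth"),
    endpoint_url="https://s3.us-south.cloud-object-storage.appdomain.cloud",
)

cos.put_object(Bucket="my-example-bucket", Key="reports/2024-01.csv",
               Body=b"id,value\n1,42\n")

obj = cos.get_object(Bucket="my-example-bucket", Key="reports/2024-01.csv")
print(obj["Body"].read().decode())
```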


Oracle Cloud Storage

Oracle Cloud Infrastructure provides data storage options for a wide spectrum of applications from small websites to the most demanding enterprise applications.

When the ultimate in performance is required, local NVMe SSDs provide extreme storage performance for VMs and bare metal compute instances. Examples include relational databases, data warehousing, big data, analytics, AI, and HPC applications.
Block volumes provide high-performance, persistent storage for a wide range of application workloads and can scale to 512 TB per compute instance. Typical workloads include NoSQL databases, Hadoop/HDFS applications, IoT, and eCommerce applications.
An easy-to-implement file system can be shared across many applications from all operating systems. Start small and scale as data grows. Perfect for migration of on-premises applications, media management, content management, and web applications.

The ROI4CIO Product Catalog is a database of business software, hardware, and IT services. Use filters to select IT products by category, supplier or vendor, business tasks, problems, and the availability of an ROI calculator or price calculator. Find the right business solutions by using a neural network search based on the results of product deployments at other companies.