Showing posts with label Elastic Compute Cloud. Show all posts
Thursday, January 3, 2008
Terabyte PC Coming
It's just a data point, but note that Asus, the Taiwanese computer maker, is planning to bring to market a notebook PC with two 500 GByte hard drives from Hitachi Global Storage Technologies.
That's a terabyte. Those of you familiar with enterprise storage, think about it: a terabyte per user. Those of you who have to do your own backups, think about it: losing a significant portion of a terabyte of data if your hard disks crash.
The upside is that such a user can store 1,000 hours of video, or more than 350 feature-length movies, or 250,000 four-minute songs. The downside? If those files are not backed up someplace, huge collections of audio or video can vanish.
The point is that storage continues to emerge as a function that is becoming harder to manage. It is harder to back up, harder to restore, harder to secure, index and retrieve. Part of the reason is that there simply is so much more information to store. This graphic from searchstorage.com makes the point that storage and backup requirements grow steadily.
Which makes the argument for storage in the cloud ever more compelling. If one's authorized copies of music, video or other material are stored in the cloud, local hard drives can crash with little threat of losing the content. Not to mention that the files can be used on any number of endpoints (I didn't say downloaded to those endpoints).
Gary Kim has been a digital infra analyst and journalist for more than 30 years, covering the business impact of technology, pre- and post-internet. He sees a similar evolution coming with AI. General-purpose technologies do not come along very often, but when they do, they change life, economies and industries.
Saturday, December 22, 2007
IBM Blue Cloud: Internet Style Data Centers
IBM’s Blue Cloud is a platform for cloud-based computing, expected to be available to customers in the spring of 2008, supporting systems with Power and x86 processors.
"Blue Cloud" will allow corporate data centers to operate more like the Internet, enabling computing across a distributed, globally accessible fabric of resources, rather than on local machines or remote server farms.
It is, along with Amazon's Elastic Compute Cloud, a seminal step towards network-based computing architectures. Sun Microsystems was ahead of its time in declaring that the "network is the computer." But cloud computing is going to fulfill the prediction.
Call it "software as a service" if you like. The point is that we are nearing an era where resources will be invoked from the computing cloud using a Web browser. Policies still will be needed to authorize use of specific resources, to be sure. But the larger point is that computing, storage and application resources will reside "in the cloud," and be invoked as required by users at the edge of the cloud.
There are all sorts of practical advantages. Distributed or mobile workers can simply invoke their services and information from wherever they are, using a standard Web browser. Everyone always will have the latest version, patch or update.
Computationally intense activities can be handled by clusters of machines designed for such intensity. Storage can be invoked, not carried; used rather than built.
If a developer needs expensive resources, they can be acquired on a "time-shared" basis, rather than on a "build your own computing center" basis.
Blue Cloud will be based on open standards and open source software supported by IBM software, systems technology and services.
The interesting speculation is about how cloud computing might change the way enterprises think about their application and storage architectures. Given the massive increase in the scale of IT environments, one wonders how they'll assess the trade-offs between "building data centers" and "renting resources."
Up to this point, the enterprise data center has been the ultimate computing resource. Might the "cloud" surpass even local and networked data centers?
Labels: Amazon, Blue Cloud, cloud computing, Elastic Compute Cloud, IBM
Wednesday, December 19, 2007
Amazon DevPay: Getting Paid for Cloud Apps
Amazon DevPay is a simple-to-use billing and account management service that makes it easy for developers to get paid for applications they build on Amazon Web Services.
Amazon DevPay allows app providers to quickly sign up customers, automatically meter their usage of services, have Amazon bill users, and collect payments.
Amazon DevPay provides a simple Web interface for pricing applications based on any combination of up-front, recurring and usage-based fees.
To use Amazon DevPay, developers build on Amazon S3 or an Amazon EC2 Machine Image (AMI), register the application with Amazon DevPay, provide a product description and configure the desired pricing.
The Amazon DevPay purchase pipeline is linked to the application's Web site. Activity is monitored on the Amazon DevPay Activity page.
There are no minimum fees and no setup charges. Activity is billed at three percent of the transaction amounts and $0.30 per bill generated.
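The fee structure above is simple enough to sketch in a few lines. This is a hedged illustration, not Amazon's actual billing code; the function name and rounding behavior are assumptions for the example.

```python
def devpay_fee(transaction_total, bills_generated=1):
    """Estimate Amazon DevPay's cut: 3 percent of the transaction
    amount plus $0.30 for each bill generated, per the posted
    pricing. (Illustrative only; actual billing may round
    differently.)"""
    return round(0.03 * transaction_total + 0.30 * bills_generated, 2)

# A $20.00 monthly charge, billed once:
print(devpay_fee(20.00))  # -> 0.9
```

So a developer selling a $20-per-month service keeps $19.10 of each bill, with no setup charge or minimum to worry about.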
Labels: Amazon, cloud computing, DevPay, Elastic Compute Cloud
Amazon SimpleDB: Boost for Cloud Computing
Amazon now offers SimpleDB, a Web service for running queries on structured data in real time. This service works in close conjunction with Amazon Simple Storage Service (Amazon S3) and Amazon Elastic Compute Cloud (Amazon EC2), collectively providing the ability to store, process and query data sets in the cloud.
Traditionally, this type of functionality has been accomplished with a clustered relational database that requires a sizable upfront investment. In contrast, Amazon SimpleDB is easy to use and provides the core functionality of a database--real-time lookup and simple querying of structured data--without the operational complexity.
Amazon SimpleDB automatically indexes data and provides a simple API for storage and access.
Amazon SimpleDB provides streamlined access to the lookup and query functions that traditionally are achieved using a relational database cluster, while leaving out other complex, often-unused database operations.
Amazon SimpleDB allows easy scaling of applications as well. For the beta release, each domain is limited to 10 gigabytes, and an account is limited to 100 domains. Over time these limits may be raised, Amazon says.
The service runs within Amazon's high-availability data centers and fully indexed user data is stored redundantly across multiple servers and data centers.
Amazon SimpleDB is designed to integrate easily with other web-scale services such as Amazon EC2 and Amazon S3. For example, developers can run their applications in Amazon EC2 and store their data objects in Amazon S3. Amazon SimpleDB can then be used to query the object metadata from within the application in Amazon EC2 and return pointers to the objects stored in Amazon S3.
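The pattern described above, querying attribute metadata and getting back pointers to objects in S3, can be sketched with a toy in-memory stand-in. To be clear, the class below is hypothetical and does not use the real SimpleDB API; it only illustrates the store-attributes-then-query-for-S3-keys workflow.

```python
# A toy stand-in for a SimpleDB domain: items carry attribute
# dictionaries, and queries return pointers (S3 keys) to the
# actual objects, which would live in Amazon S3.
class MetadataDomain:
    def __init__(self):
        self.items = {}  # item name -> attribute dict

    def put_attributes(self, item, attrs):
        self.items[item] = dict(attrs)

    def query(self, **criteria):
        """Return the S3 keys of items whose attributes match."""
        return [attrs["s3_key"] for attrs in self.items.values()
                if all(attrs.get(k) == v for k, v in criteria.items())]

photos = MetadataDomain()
photos.put_attributes("img1", {"s3_key": "photos/beach.jpg", "tag": "vacation"})
photos.put_attributes("img2", {"s3_key": "photos/desk.jpg", "tag": "work"})
print(photos.query(tag="vacation"))  # -> ['photos/beach.jpg']
```

An application running in EC2 would issue the query, then fetch the returned keys from S3, keeping the heavy objects out of the database entirely.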
Developers and users pay only for what they use; there are no minimum fees.
Machine use costs $0.14 per Amazon SimpleDB Machine Hour consumed. Data transfer in costs $0.10 per gigabyte. Data transfer out varies with volume: $0.18 per GB for the first 10 TB per month, $0.16 per GB for the next 40 TB, and $0.13 per GB beyond 50 TB.
Structured data storage costs $1.50 per GB-month.
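Putting the quoted rates together, a rough monthly bill is easy to estimate. The calculator below is a sketch based on the beta pricing cited above (treating 1 TB as 1,000 GB); Amazon's actual metering granularity may differ.

```python
def simpledb_monthly_cost(machine_hours, gb_in, gb_out, gb_stored):
    """Estimate a monthly SimpleDB bill from the quoted beta
    pricing: $0.14/machine hour, $0.10/GB in, $1.50/GB-month
    stored, and tiered data-transfer-out rates. Illustrative only."""
    cost = 0.14 * machine_hours + 0.10 * gb_in + 1.50 * gb_stored
    # Tiered transfer out: first 10 TB at $0.18/GB, next 40 TB
    # at $0.16/GB, anything beyond 50 TB at $0.13/GB.
    tiers = [(10_000, 0.18), (40_000, 0.16), (float("inf"), 0.13)]
    remaining = gb_out
    for tier_size, rate in tiers:
        used = min(remaining, tier_size)
        cost += used * rate
        remaining -= used
        if remaining <= 0:
            break
    return round(cost, 2)

# 100 machine hours, 50 GB in, 200 GB out, 5 GB stored:
print(simpledb_monthly_cost(100, 50, 200, 5))  # -> 62.5
```

At small scale the bill is dominated by machine hours and transfer out, which is exactly the pay-for-what-you-use profile the post describes.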
The point is that it is becoming easier by the day to create, store and execute applications entirely "in the cloud," without owning or leasing the data facilities, access pipes or servers to support those apps. At some point, highly distributed workforces or end-user bases will find it highly attractive to support remote users with services always available through a standard Web browser, always at the latest version, with no need to install updates, patches or extensions.
As software becomes a service, computing infrastructure also is becoming a utility or service as well.
Labels: Amazon, cloud computing, Elastic Compute Cloud, SaaS