6.2 Distributing Applications to the Edge

While the application transport system is able to speed up communications over the wide-area Internet, the ultimate boost in performance, reliability, and scalability comes when the application itself can be distributed to the edge. Akamai introduced such capabilities on its platform nearly a decade ago with its EdgeComputing™ services, which allow companies to deploy and execute request-driven Java J2EE applications or application components on Akamai's edge servers. Akamai EdgeComputing takes cloud computing to a level where application resources are allocated not only on demand but also near the end user. The latter piece (i.e., proximity to the end user) is critical to performance yet still missing from most cloud computing services today.

Implementing a platform capable of EdgeComputing services requires overcoming a number of interesting technical challenges, including session management (and replication across machines), security sandboxing, fault management, distributed load balancing, and resource monitoring and management, as well as providing the appropriate testing and deployment tools. Akamai's approach to these issues and its general implementation are covered in some detail in [15], so we refer the interested reader there.

Not all types of applications can run entirely on the edge; those that rely heavily on large transactional databases will often require significant communication with origin infrastructure. However, many types of applications (or portions of applications) can benefit significantly from EdgeComputing. We summarize some categories of use cases from [15]: