HTTP continues to evolve
HTTP (Hypertext Transfer Protocol) is one of the most widely used application protocols on the Internet. Since its publication, RFC 2616 (HTTP/1.1) has served as a foundation for the unprecedented growth of the Internet: billions of devices of all shapes and sizes, from desktop computers to the tiny Web devices in our pockets, speak HTTP every day to deliver news, video, and millions of other Web applications we have all come to depend on in our everyday lives.
> Making the Web Faster with HTTP 2.0
Improving Performance on the Internet
High Performance Web Sites
How Fast is Your Web Site?
The increasing significance of intermediate representations in compilers
Program compilation is a complicated process. A compiler is a software program that translates a high-level source language program into a form ready to execute on a computer. Early in the evolution of compilers, designers introduced IRs (intermediate representations, also commonly called intermediate languages) to manage the complexity of the compilation process. The use of an IR as the compiler’s internal representation of the program enables the compiler to be broken up into multiple phases and components, thus benefiting from modularity.
> Intermediate Representation
All Your Database Are Belong to Us
Stream Processors: Programmability and Efficiency
Software Development with Code Maps
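The introduction above notes that an IR lets a compiler be broken into independent phases. As a hedged illustration of that modularity, here is a toy Python sketch (all names hypothetical, not taken from the article): a front end lowers arithmetic expressions to a three-address-code IR, and a separate back end consumes the IR, so either phase can be replaced without touching the other.

```python
# Toy sketch: front end -> three-address-code IR -> back end.
import ast

def lower(expr):
    """Front end: lower an arithmetic expression to a list of IR tuples
    of the form (op, dest, left, right)."""
    code, counter = [], 0
    def walk(node):
        nonlocal counter
        if isinstance(node, ast.Constant):
            return node.value
        if isinstance(node, ast.BinOp):
            op = {ast.Add: "add", ast.Sub: "sub", ast.Mult: "mul"}[type(node.op)]
            left, right = walk(node.left), walk(node.right)
            dest = f"t{counter}"; counter += 1
            code.append((op, dest, left, right))
            return dest
        raise ValueError("unsupported expression")
    walk(ast.parse(expr, mode="eval").body)
    return code

def run(code):
    """One possible back end: a simple interpreter over the IR.
    A code generator could consume the exact same tuples instead."""
    ops = {"add": lambda a, b: a + b, "sub": lambda a, b: a - b,
           "mul": lambda a, b: a * b}
    env = {}
    for op, dest, a, b in code:
        val = lambda x: env[x] if isinstance(x, str) else x
        env[dest] = ops[op](val(a), val(b))
    return env[dest]
```

For example, `lower("2 + 3 * 4")` produces `[("mul", "t0", 3, 4), ("add", "t1", 2, "t0")]`, and `run` on that IR evaluates to 14; an optimizer phase could rewrite the tuple list in between without either end knowing.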
Interfacing between languages is increasingly important
Interoperability between languages has been a problem since the second programming language was invented. Solutions have ranged from language-independent object models such as COM (Component Object Model) and CORBA (Common Object Request Broker Architecture) to VMs (virtual machines) designed to integrate languages, such as the JVM (Java Virtual Machine) and CLR (Common Language Runtime). With software becoming ever more complex and hardware less homogeneous, the likelihood of a single language being the correct tool for an entire program is lower than ever. As modern compilers become more modular, there is potential for a new generation of interesting solutions.
> The Challenge of Cross-language Interoperability
The Robustness Principle Reconsidered
Erlang for Concurrent Programming
The Rise and Fall of CORBA
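Alongside the object models and shared VMs named above, the simplest form of cross-language interoperability remains the foreign-function interface. As a minimal sketch (not an approach discussed in the article), Python's `ctypes` can call straight into the C standard library, provided the platform exposes a loadable libc:

```python
# Minimal FFI sketch: Python calling the C library function strlen.
# Assumes a Unix-like system where find_library("c") locates libc.
import ctypes
import ctypes.util

libc = ctypes.CDLL(ctypes.util.find_library("c"))
# Declare the C signature so ctypes marshals arguments correctly:
# size_t strlen(const char *s);
libc.strlen.argtypes = [ctypes.c_char_p]
libc.strlen.restype = ctypes.c_size_t

length = libc.strlen(b"interop")  # a C function computing a Python value
```

Declaring `argtypes` and `restype` is the FFI's version of the interoperability problem in miniature: the two languages agree on nothing unless the programmer spells out the contract at the boundary.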
It’s not always size that matters
I’ve been dealing with a large program written in Java that seems to spend most of its time asking me to restart it because it has run out of memory. I’m not sure if this is an issue in the JVM (Java Virtual Machine) I’m using or in the program itself, but during these frequent restarts, I keep wondering why this program is so incredibly bloated. I would have thought Java’s garbage collector would prevent programs from running out of memory, especially when my desktop has quite a lot of it. It seems that eight gigabytes just isn’t enough to handle a modern IDE anymore.
Lack of RAM
> Bugs and Bragging Rights
Combining agile and SEMAT yields more advantages than either one alone
IVAR JACOBSON, IAN SPENCE, AND PAN-WEI NG
Today, as always, many different initiatives are under way to improve the ways in which software is developed. The most popular and prevalent of these is the agile movement. One of the newer kids on the block is the SEMAT (Software Engineering Method and Theory) initiative. As with any new initiative, people are struggling to see how it fits into the world and relates to all the other things going on. For example, does it improve or replace their current ways of working? Is it like lean, which supports and furthers the aims of the agile movement, or is it more like waterfall planning, which is in opposition to an agile approach?
> Agile and SEMAT — Perfect Partners
Breaking the Major Release Habit
UX Design and Agile: A Natural Fit?
The Essence of Software Engineering: The SEMAT Kernel
Merging the art and science of software development
Software life-cycle management was, for a very long time, a controlled exercise. The duration of product design, development, and support was predictable enough that companies and their employees scheduled their finances, vacations, surgeries, and mergers around product releases. When developers were busy, QA (quality assurance) had it easy. As the coding portion of a release cycle came to a close, QA took over while support ramped up. Then, when the product was released, the development staff exhaled, rested, and started the loop again while the support staff transitioned to busily supporting the new product.
> Adopting DevOps Practices in Quality Assurance
Traipsing Through the QA Tools Desert
The Antifragile Organization
Quality Assurance: Much More than Testing
A close look at RTT measurements with TCP
STEPHEN D. STROWES, BOUNDARY INC.
Measuring and monitoring network RTT (round-trip time) is important for multiple reasons: it allows network operators and end users to understand their network performance and to optimize their environment, and it helps businesses understand the responsiveness of their services to sections of their user base.
> Passively Measuring TCP Round-trip Times
You Don’t Know Jack about Network Performance
Bufferbloat: Dark Buffers in the Internet
TCP Offload to the Rescue
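The core idea behind passive RTT measurement can be sketched in a few lines: pair each outgoing TCP segment with the first acknowledgment that covers it, and take the elapsed time as one RTT sample. The Python below is an illustrative toy, not the article's method; the event format and names are hypothetical, and it follows the usual rule (Karn's algorithm) of discarding samples made ambiguous by retransmission.

```python
# Toy passive RTT estimator over a trace of (time, kind, number) events:
# kind "send" carries the segment's highest sequence byte,
# kind "ack" carries the cumulative acknowledgment number.
def passive_rtts(events):
    pending = {}   # sequence number -> send time (None if retransmitted)
    samples = []
    for t, kind, num in events:
        if kind == "send":
            if num in pending:
                # Retransmission: we can't tell which copy the ACK will
                # answer, so discard the sample (Karn's algorithm).
                pending[num] = None
            else:
                pending[num] = t
        else:
            # A cumulative ACK acknowledges every outstanding byte <= num.
            for seq in [s for s in pending if s <= num]:
                t0 = pending.pop(seq)
                if t0 is not None:
                    samples.append(t - t0)
    return samples
```

For instance, a segment sent at t=0.00 and acknowledged at t=0.05 yields one 50 ms sample; a real passive monitor would additionally have to handle SACK, reordering, and clock granularity.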
Eliminating memory hogs
A space leak occurs when a computer program uses more memory than necessary. In contrast to memory leaks, where the leaked memory is never released, the memory consumed by a space leak is released, but later than expected. This article presents examples of space leaks and shows how to spot and eliminate them.
> Leaking Space
NUMA (Non-Uniform Memory Access): An Overview
Software Transactional Memory: Why Is It Only a Research Toy?
You Don’t Know Jack about Shared Variables or Memory Models
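The distinction drawn above, memory that is released but later than expected, can be illustrated with a small Python sketch (my own example, not one from the article). Both functions below compute the same sum and both eventually free their memory, but the first materializes every intermediate value at once before summing, while the second streams:

```python
# Compare peak memory of an eager computation vs. a streaming one.
import tracemalloc

def sum_squares_leaky(n):
    # Space leak in the sense above: the whole list exists at once,
    # and is released only after the sum completes.
    squares = [i * i for i in range(n)]
    return sum(squares)

def sum_squares_streaming(n):
    # A generator yields one value at a time, so peak memory stays flat.
    return sum(i * i for i in range(n))

tracemalloc.start()
sum_squares_leaky(200_000)
_, peak_leaky = tracemalloc.get_traced_memory()
tracemalloc.reset_peak()
sum_squares_streaming(200_000)
_, peak_streaming = tracemalloc.get_traced_memory()
tracemalloc.stop()
```

Measured with `tracemalloc`, the eager version's peak is far above the streaming version's even though neither leaks in the never-released sense, which is exactly why space leaks evade tools that only look for unreleased allocations.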
High-frequency Trading and Exchange Technology
I am a former high-frequency trader. For a few wonderful years I led a group of brilliant engineers and mathematicians, and together we traded in the electronic marketplaces and pushed systems to the edge of their capability.
> Barbarians at the Gateways
The challenges faced by competing HFT algorithms
JACOB LOVELESS, SASHA STOIKOV, AND ROLF WAEBER
HFT (high-frequency trading) has emerged as a powerful force in modern financial markets. Only 20 years ago, most of the trading volume occurred in exchanges such as the New York Stock Exchange, where humans dressed in brightly colored outfits would gesticulate and scream their trading intentions. Nowadays, trading occurs mostly on servers in data centers, where computers communicate their trading intentions through network messages. This transition from physical exchanges to electronic platforms has been particularly profitable for HFT firms, which invested heavily in the infrastructure of this new environment.
> Online Algorithms in High-frequency Trading