On repositories of patches and the tension between security professionals and in-house developers
Probabilistic algorithms are all around us--not only are they acceptable, but some programmers actually seek out chances to use them.
The "Leftover Principle" requires increasingly highly skilled humans.
Catering to developers' strengths while still meeting team objectives
Branching out and changing with the times at acmqueue
On null encryption and automated documentation
No one expects the Spanish Acquisition.
Lessons learned managing a data science research team
An agile process implementation
Relevance and repeatability
What happened to the promise of rigorous, disciplined, professional practices for software development?
Soon every company will be a software company.
How do you, the reader, stay informed about research that influences your work?
What do you do when your debugger fails you?
Shortchanged by open source
Becoming better, faster, cheaper, and happier
And the illogic of PDF
The meaning of bits and avoiding upgrade bog downs
Dante's tale, as experienced by a software architect
It's not always size that matters.
Combining agile and SEMAT yields more advantages than either one alone
High-frequency Trading and Exchange Technology
The challenges faced by competing HFT algorithms
Also, the perils of premature rebooting
Software is supposed to be a part of computer science, and science demands proof.
Whenever someone asks you to trust them, don't.
Is there a best used-by date for software?
One programmer's extension is another programmer's abuse.
A thinking framework in the form of an actionable kernel
Quality happens only when someone is responsible for it.
The bytes you save today may bite you tomorrow
Colorful metaphors and properly reusing functions
Until our programming languages catch up, code will be full of horrors
Stopping to smell the code before wasting time reentering configuration data
Shortcuts that save money and time today can cost you down the road.
Using a tool for the wrong job is OK until the day when it isn't.
KV hates unnecessary work.
What separates good code from great code?
Keep your debug messages clear, useful, and not annoying.
Did Ken, Dennis, and Brian choose wrong with NUL-terminated text strings?
As more and more systems care about time at the second and sub-second level, finding a lasting solution to the leap seconds problem is becoming increasingly urgent.
Using tools such as Automake and Autoconf with preexisting code bases can be a major hassle.
Why can't we all use standard libraries for commonly needed algorithms?
A good library is like a garden.
Component models can help diagnose architectural problems in both new and existing systems.
Debugging an ephemeral problem
Gathering statistics is important, but so is making them available to others.
Overspecialization can be the kiss of death for sysadmins.
Emulating a video system shows how even a simple interface can be more complex—and capable—than it appears.
An essential technique used in emulator development is a useful addition to any programmer's toolbox.
Frequent broken builds could be symptomatic of deeper problems within a development project.
When is the right time to commit changes?
Integrating changes in branched development
Long considered an afterthought, software maintenance is easiest and most effective when built into a system from the ground up.
Whether distributed or centralized, all revision-control systems come with complicated sets of tradeoffs. How do you find the best match between tool and team?
It has been five years since I first wrote about code spelunking, and though systems continue to grow in size and scope, the tools we use to understand those systems are not growing at the same rate. In fact, I believe we are steadily losing ground. So why should we go over the same ground again? Is this subject important enough to warrant two articles in five years? I believe it is.
A koder with attitude, KV answers your questions. Miss Manners he ain't.
A recent conversation about development methodologies turned to the relative value of various artifacts produced during the development process, and the person I was talking with said that the code has "always been the only artifact that matters. It's just that we're only now coming to recognize that." My reaction to this, not expressed at that time, was twofold. First, I got quite a sense of déjà vu, since it hearkened back to my time as an undergraduate and memories of many heated discussions about whether code was self-documenting.
The C/C++ Solution Manager at Parasoft explains how infrastructure elements allow development teams to increase productivity without restricting creativity.
Jeff Johnstone of TechExcel explains why there is a need for a new approach to application lifecycle management that better reflects the business requirements and challenges facing development teams.
Kode Vicious is hungry. He sustains himself on your questions from the software development trenches (and lots of beer). Without your monthly missives, KV is like a fish out of water, or a scientist without a problem to solve. So please, do your part to keep him sane (or at least free from psychotic episodes), occupied, and useful.
Why changing APIs might become a criminal offense. After more than 25 years as a software engineer, I still find myself underestimating the time it will take to complete a particular programming task. Sometimes, the resulting schedule slip is caused by my own shortcomings: as I dig into a problem, I simply discover that it is a lot harder than I initially thought, so the problem takes longer to solve--such is life as a programmer. Just as often I know exactly what I want to achieve and how to achieve it, but it still takes far longer than anticipated. When that happens, it is usually because I am struggling with an API that seems to do its level best to throw rocks in my path and make my life difficult.
Errors, deceptions, and ambiguity
Chasing citations through endless, mislabeled nodes
Can agile development make your team more productive?
A look inside an extensible plug-in architecture. Eclipse is both an open, extensible development environment for building software and an open, extensible application framework upon which software can be built. Considered the most popular Java IDE, it provides a common UI model for working with tools and promotes rapid development of modular features based on a plug-in component model. The Eclipse Foundation designed the platform to run natively on multiple operating systems, including Macintosh, Windows, and Linux, providing robust integration with each and rich clients that support the GUI interactions everyone is familiar with: drag and drop, cut and paste (clipboard), navigation, and customization.
Dear KV, I'm on a small team that is building a custom, embedded, consumer device that is due out by Christmas. Of course the schedule is tight and there are make-or-break dates that if we miss basically mean the product will never make it to market. Not the most fun environment in which to have problems. The software was carefully specified and laid out and then simulated while the hardware was being manufactured. Now we have real hardware, and real problems as well. Aside from the timing issues we found when we were no longer running the software in a simulator, several bugs remain that show up only under very special circumstances and that disappear when I use the debugger or turn on the logging code built into the system.
Despite the considerable effort invested by industry and academia in modeling standards such as UML (Unified Modeling Language), software modeling has long played a subordinate role in commercial software development. Although modeling is generally perceived as state of the art and thus as something that ought to be done, its appreciation seems to pale along with the progression from the early, more conceptual phases of a software project to those where the actual handcrafting is done.
There are not many names bigger than Ray Ozzie's in computer programming. An industry visionary and pioneer in computer-supported cooperative work, he began his career as an electrical engineer but fairly quickly got into computer science and programming. He is the creator of IBM's Lotus Notes and is now chief technical officer of Microsoft, reporting to chief software architect Bill Gates. Recently, Ozzie's role as chief technical officer expanded as he assumed responsibility for the company's software-based services strategy across its three major divisions.
The problem? Computers make it too easy to copy data.
Not only does California give you plenty of sun, it also apparently has employers that give you plenty of time to play around with the smaller problems that you like, in a programming language that's irrelevant to the later implementation.
Please allow me the pleasure of leading you on an 'office safari', so to speak. On today's journey we'll travel the corridors of computerdom in search of the widespread but elusive mal managerium, or bad manager, in common parlance. They will be difficult to spot because we will be in a sense looking for that most elusive creature of all: ourselves. That is to say, it's quite possible that many of us will share some of the qualities with the various types of bad managers we shall encounter. Qualities that we are loath to admit we possess, I might add.
Dear KV, My co-workers keep doing really bad things in the code, such as writing C++ code with macros that have gotos that jump out of them, and using assert in lower-level functions as an error-handling facility. I keep trying to get them to stop doing these things, but the standard response I get is, "Yeah, it's not pretty, but it works." How can I get them to start asking, "Is there a better way to do this?" They listen to my arguments but don't seem convinced. In some cases they even insist they are following good practices.
Dear KV, I'm maintaining some C code at work that is driving me right out of my mind. It seems I cannot go more than three lines in any file without coming across a chunk of code that is conditionally compiled.
The thorough use of internal documentation is one of the most overlooked ways of improving software quality and speeding implementation.
The program should be a small project, but every time I start specifying the objects and methods it seems to grow to a huge size, both in the number of lines and the size of the final program.
Dear KV, My officemate writes methods that are 1,000 lines long and claims they are easier to understand than if they were broken down into a smaller set of methods. How can we convince him his code is a maintenance nightmare?
Dear KV, Whenever my team reviews my code, they always complain that I don't check for return values from system calls. I can see having to check a regular function call, because I don't trust my co-workers, but system calls are written by people who know what they're doing--and, besides, if a system call fails, there isn't much I can do to recover. Why bother?
Dear Kode Vicious, I have this problem. I can never seem to find bits of code I know I wrote. This isn't so much work code--that's on our source server--but you know, those bits of test code I wrote last month, I can never find them. How do you deal with this?
And you think you have problems?
What does punctuation have to do with software development?
Hot scientific tools trickle down to support mainstream IT tasks.
What's wrong with taking our profession a little more seriously?
In the PC and server worlds, the engineering battle for computer performance has often focused on the hardware advances Intel brings to its microprocessors.
A potentially deadly illness, clinically referred to as UML (Unified Modeling Language) fever, is plaguing many software-engineering efforts today. This fever has many different strains that vary in levels of lethality and contagion. A number of these strains are symptomatically related, however. Rigorous laboratory analysis has revealed that each is unique in origin and makeup. A particularly insidious characteristic of UML fever, common to most of its assorted strains, is the difficulty individuals and organizations have in self-diagnosing the affliction. A consequence is that many cases of the fever go untreated and often evolve into more complex and lethal strains.
Rumors of the demise of the Waterfall Life-cycle Model are greatly exaggerated. We discovered this and other disappointing indicators about current software engineering practices in a recent survey of almost 200 software professionals. These discoveries raise questions about perception versus reality with respect to the nature of software engineers, software engineering practice, and the industry.
Forty years ago, when computer programming was an individual experience, the need for easily readable code wasn't on any priority list. Today, however, programming usually is a team-based activity, and writing code that others can easily decipher has become a necessity. Creating and developing readable code is not as easy as it sounds.
Remember the halcyon days when development required only a text editor, a compiler, and some sort of debugger (in cases where the odd printf() or two alone didn't serve)? During the early days of computing, these were independent tools used iteratively in development's golden circle. Somewhere along the way we realized that a closer integration of these tools could expedite the development process. Thus was born the integrated development environment (IDE), a framework and user environment for software development that's actually a toolkit of instruments essential to software creation. At first, IDEs simply connected the big three (editor, compiler, and debugger), but nowadays most go well beyond those minimum requirements.
Welcome to my first installment of ACM Queue's ToolKit column. Each issue I'll dig beneath the market-friendly, feature-rich exterior of some of the best-known (and some of the least-known) development tools in an attempt to separate the core app from the product spec sheet.
Stand-up meetings are an important component of the 'whole team', which is one of the fundamental practices of extreme programming (XP).
The discipline, science, and art of interface design has gone stagnant. The most widely read books on the subject are primarily compendia of how to make the best of received widgets. The status quo is mistaken for necessity. Constrained in this chamber pot, designers wander around giving the users of their products little comfort or fresh air.
As part of this issue on programmer tools, we at Queue decided to conduct an informal Web poll on the topic of debugging. We asked you to tell us about the tools that you use and how you use them. We also collected stories about those hard-to-track-down bugs that sometimes make us think of taking up another profession.
Typical software development involves one of two processes: the creation of new software to fit particular requirements or the modification (maintenance) of old software to fix problems or fit new requirements. These transformations happen at the source-code level. But what if the problem is not the maintenance of old software but the need to create a functional duplicate of the original? And what if the source code is no longer available?
Cool tools are seductive. When we think about software productivity, tools naturally come to mind. When we see pretty new tools, we tend to believe that their amazing features will help us get our work done much faster. Because every software engineer uses software productivity tools daily, and all team managers have to decide which tools their members will use, the latest and greatest look appealing.
An epigram: "We may not feel these limitations until they have been lifted from us, just as we often do not know we are sick until we suddenly feel better. Therefore, it is reasonable to expect that future languages will make us feel those limitations of [our present environments] that are not detectable today." --Gerald Weinberg