Anxiously awaiting the arrival of all-optical computing? Don't hold your breath.
Leveraging technology to support aging relatives in their homes is a cost-efficient way to maintain health and happiness and extend life. As the technology expert for my extended family, I have been charged with architecting the infrastructure that will support our aging loved ones in their homes for as long as possible. Over the years, I have assisted four different senior households in achieving this goal, and although things have been bumpy at times, I have refined technical solutions and methodologies that seem to work well.
We hear it all the time. The cost of disk space is plummeting.
There is no magic, and the lessons of the past apply just as well today.
A good idea, but it can be taken too far
Is it true that politics and technology don't mix?
Most people I know run wireless networks in their homes. Not me. I hardwired my home and leave the Wi-Fi turned off. My feeling is to do it once, do it right, and then forget about it. I want a low-cost network infrastructure with guaranteed availability, bandwidth, and security. If these attributes are important to you, Wi-Fi alone is probably not going to cut it.
SOA is no more a silver bullet than the approaches that preceded it. Back in ancient times, say, around the mid-'80s when I was a grad student, distributed systems research was in its heyday. Systems like Trellis/Owl and Eden/Emerald were exploring issues in object-oriented language design, persistence, and distributed computing. One of the big themes to come out of that period was 'location transparency': the idea that how you access an object should be independent of where it is located. That is, it shouldn't matter whether an object is in the same process, in a different process on the same machine, or on another machine altogether.
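Location transparency can be sketched in a few lines: client code invokes the same interface whether the object is local or hides a network hop behind a proxy. (The names below, and the lambda standing in for an RPC stub, are illustrative assumptions, not drawn from any particular system.)

```python
from abc import ABC, abstractmethod


class Account(ABC):
    """A single interface, regardless of where the object lives."""

    @abstractmethod
    def get_balance(self) -> int: ...


class LocalAccount(Account):
    """In-process implementation."""

    def __init__(self, balance: int):
        self._balance = balance

    def get_balance(self) -> int:
        return self._balance


class RemoteAccountProxy(Account):
    """Stands in for an object on another machine; the 'network call'
    is simulated here with a plain callable (e.g., an RPC stub)."""

    def __init__(self, fetch):
        self._fetch = fetch

    def get_balance(self) -> int:
        return self._fetch()


def report(account: Account) -> str:
    # Client code is identical whichever implementation it receives.
    return f"balance: {account.get_balance()}"


print(report(LocalAccount(100)))                 # local object
print(report(RemoteAccountProxy(lambda: 100)))   # simulated remote object
```

Whether this transparency is a blessing or a trap, of course, is exactly the question the SOA debate keeps reopening.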
Project managers love him, recent software engineering graduates bow to him, and he inspires code warriors deep in the development trenches to wonder if a technology time warp may have passed them by. How can it be that no one else has ever proposed software development with the simplicity, innovation, and automation being trumpeted by Architect Tom? His ideas sound so space-age, so futuristic, but why should that be so surprising? After all, Tom is an architecture astronaut!
SIP does a great job as a helicopter, but when you try to make it function as an IM submarine as well, disaster may follow.
A recent conversation about development methodologies turned to the relative value of various artifacts produced during the development process, and the person I was talking with said the code has "always been the only artifact that matters. It's just that we're only now coming to recognize that." My reaction to this, not expressed at the time, was twofold. First, I got quite a sense of déjà vu, since it hearkened back to my time as an undergraduate and memories of many heated discussions about whether code was self-documenting.
With 1 TB of RAID 5 storage, most of my friends believe I have really gone off the deep end with my home server. They may be right, but as in most things in life, I have gotten to this point through a rational series of individual upgrades, each perfectly reasonable at the time. Rather than being overly indulgent to my inner geek, am I an early adopter of what will be the inevitable standard for home IT infrastructure? Here is my story; you be the judge.
New uses for small form-factor, low-power machines
Risk is a necessary consequence of dependence
Common wisdom has it that enterprises need firewalls to secure their networks.
Stand-up meetings are an important component of the 'whole team', which is one of the fundamental practices of extreme programming (XP).
And you think you have problems?
An epigram: "We may not feel these limitations until they have been lifted from us, just as we often do not know we are sick until we suddenly feel better. Therefore, it is reasonable to expect that future languages will make us feel those limitations of [our present environments] that are not detectable today." --Gerald Weinberg
Ada remains the Rodney Dangerfield of computer programming languages, getting little respect despite a solid technical rationale for its existence. Originally pressed into service by the U.S. Department of Defense in the late 1970s, these days Ada is just considered a remnant of bloated military engineering practices.
How many of us have not had the experience of sitting in a classroom wondering idly: "Is this really going to matter out in the real world?" It's curious, and more than a little humbling, to realize how many of those nuggets of knowledge really do matter. One cropped up recently for me: the finite state machine (FSM). As we continue to develop the new UI for our product, we'll definitely be using FSMs wherever possible.
The status quo prevails in interface design, and the flawed concept of cut-and-paste is a perfect example.