Linux Software

Pervasive Computing Systems (88 comments)

nickynicky9doors writes "Washington Technology has an article on Smart Conference Rooms. 'Pervasive-computing systems ...will come about through large numbers of small devices and sensors, some so unobtrusive that people won't know they're interacting with a computer at all.' The Smart Flow System was designed with open-source middleware and the data acquisition system is based on a Linux cluster of 14 computers."
This discussion has been archived. No new comments can be posted.

  • by envelope ( 317893 ) on Tuesday February 19, 2002 @11:36AM (#3032046) Homepage Journal
    Sounds like with the microphone array it will be able to hear stomachs growling and go ahead and order lunch for everyone.
  • by drew_kime ( 303965 ) on Tuesday February 19, 2002 @11:36AM (#3032048) Journal
    Aww, crap, it really is a cluster. Well that's not very funny, now is it?
    • Narrowly focusing your attention on a Beowulf cluster of Linux computers totally misses the point and ignores the real meaning of Ubiquitous Computing, Calm Technology, or Pervasive Computing as it's being called these days.

      The trivial, uninteresting detail that the system is currently implemented by a "dramatic" machine, a Beowulf cluster of Linux computers, hidden away in a server room somewhere out of sight, is the least important thing about the research, and totally misses the point.

      But it's just fashionable to mention Linux in a newspaper article like that, to wind up the anti-Microsoft kids, so you get slashdotted with lots of free publicity. Otherwise, Slashdot would never carry an article about Ubiquitous Computing that didn't mention Linux.

      Mark Weiser wrote the following definition of Ubiquitous Computing [ubiq.com] in 1988:

      For thirty years most interface design, and most computer design, has been headed down the path of the "dramatic" machine. Its highest ideal is to make a computer so exciting, so wonderful, so interesting, that we never want to be without it. A less-traveled path I call the "invisible"; its highest ideal is to make a computer so imbedded, so fitting, so natural, that we use it without even thinking about it. (I have also called this notion "Ubiquitous Computing", and have placed its origins in post-modernism.) I believe that in the next twenty years the second path will come to dominate. But this will not be easy; very little of our current systems infrastructure will survive. We have been building versions of the infrastructure-to-come at PARC for the past four years, in the form of inch-, foot-, and yard-sized computers we call Tabs, Pads, and Boards. Our prototypes have sometimes succeeded, but more often failed to be invisible. From what we have learned, we are now exploring some new directions for ubicomp, including the famous "dangling string" display.

      -Mark Weiser

      ====

      -Don

  • by crumbz ( 41803 ) <[<remove_spam>ju ... spam>gmail.com]> on Tuesday February 19, 2002 @11:38AM (#3032060) Homepage
    I like the secured room in Neuromancer (Count Zero? I forget) that you have to pay by the minute for complete privacy. No possibility of listening devices or transmitters. A complete 180 from this idea. Everyday privacy is being drawn and quartered. Granted, a conference room might not seem to be the most secure place to discuss a sensitive topic, but a voice-recognizing cluster of Linux boxes? Give me a break. What is to stop employers from deploying this technology throughout the office space in the name of security? See why 2002 will be like 1984.
  • I think the biggest problem with systems like this is trusting the computers to Do The Right Thing(TM).
    • Re:Trust? (Score:2, Informative)

      by delta407 ( 518868 )

      AT&T Labs has produced something they call Sentient Computing [att.com], and while technically it may not be sentient, it would probably be kind of creepy. Each person wears a "bat" that lets a central computer monitor their whereabouts and, based on information fed to it by various sensors, deduce what the person is doing.

      This is a neat system; you can point at things with the bat and the computer will respond (like pointing the bat at a poster to choose scanner settings). However, since this computer is tied into the phone system (among other things), it could get kind of scary.

      • >This is a neat system; you can point at
        >things with the bat and the computer will respond

        if i mean to point at the poster, but accidentally point at the free poster frame from double-bat-point-click.com, do i have to suffer through a commercial jingle from at&t?

        just as users of the www have had to learn to ignore banner ads, close pop-ups, etc., so users of this technology will have to learn to ignore gibbering, day-glo beer ads.
      • however, since this computer is tied into the phone system (among other things), this could get kind of scary.

        Pointy-haired Boss: Darnit! We're trapped in here! The door is stuck and the ventilation system isn't working! Computer, contact security and have them send someone to open the door.

        Computer: I'm sorry Dave. I can't do that.

        Seriously, though. I, for one, would be hesitant to have a meeting in a room that *might* be recording everything I say. I would prefer a visible dictation device that sits on the table where I can see it working and provides me with visual feedback of the dictation. If they could just improve speech recognition technology, that would be enough of a leap forward for me. Forget the rest of the crap.

  • by russianspy ( 523929 ) on Tuesday February 19, 2002 @11:39AM (#3032070)
    Technology is great, but sometimes we all need a bit of peace and quiet. Am I the only one who actually unplugs his phone, sets the answering machine to silent, and simply reads a book in peace? I find that having too many gadgets around you makes for a stressful time; something is always beeping at you. It's like having 5 kids continuously asking for attention.
    I propose a petition. There should be one room in every house without ANY computers, telephones, or other devices that need "attention" of any kind. Keep computers out of my bathroom!
  • I like the potential to not need to manually control the cameras when video-conferencing. I just love it when the speaker on the other end has happy feet and you wind up wasting time trying to keep him in frame ;)

    I also like the fact that the guts of the project are being released to the public so we can all play. Not like I have the equipment to set this up, but it's nice to know I could if I wanted to.

  • Suddenly gets another meaning... while you're in the boardroom easily ordering cups of coffee, getting everyone's taste from their PDAs, while continuing to blab about next quarter's profits...

    Ha... the moment someone walks into the boardroom, their PDA sends electronic business cards to everyone else's PDA. Cell phones switch off automatically, or at the very least switch to vibrate instead of your favorite MP3. Appointments spoken in the room will be recorded and an agenda entry will be made...

    Could see uses for this in companies... can of course be hacked in such a way that you can read out everyone's agenda to see whether their next meeting will be with your competitor or not.
  • by jweb ( 520801 ) <(jweb68) (at) (hotmail.com)> on Tuesday February 19, 2002 @11:41AM (#3032089)
    On the surface, this may seem like a Good Thing(tm), but think for a second: Do you really want your walls recording everything you do, everything you say over the course of a day?
    • Richard Nixon learned that lesson quite well ;-)
    • I know I may be setting myself up here, but is it really going to be such an issue? I mean, think about how much compute power you'd need to recognise everyone at work, on the train, on the bus, etc., how much storage it's all going to take, and who the hell is going to pay for it all.
    • Do you really want your walls recording everything you do, everything you say over the course of a day?

      Fox would probably bring out a "If These Walls Could Talk" series about CEOs and their secretaries.
    • In cases where pervasive computing is really useful, the lab rats [washington.edu] (e.g. underpaid graduate students) will not have a say in the matter.

    • On the surface, this may seem like a Good Thing(tm), but think for a second: Do you really want your walls recording everything you do, everything you say over the course of a day?

      At first this produced a knee-jerk reaction "Of course not!"

      But then I thought further. Weren't most of us taught that there's a "God" watching everything we do, judging us?

      Perhaps pervasive computing is simply the technical term for "God 2.0".

      David Brin had written [kithrup.com] about this many years ago. I tend to agree with him: people will get used to the technology, and deal with it. It'll be somewhat disruptive, but it's necessary -- the Powers That Be want to keep tabs on their equipment (including employees), and the police will want to be able to witness crimes "in action" after-the-fact, so cameras will appear everywhere.

      Given that, it's better for us to maintain control and have cameras literally everywhere, including in the control rooms where the HR lackeys and police and politicians are doing their watching.

      And with nanotechnology, the cameras become particles of dust. With the economies of scale that nanofabrication brings with it, this dust could cover the earth. Storage would become an issue, but again, nanostorage would save the day.

  • sounds like a prelude to asimov's robot fantasies. these things dont walk and talk yet, but he had the right idea with the three laws--otherwise, these things can be used for quite some nasty personal abuse, to say the least.

    QED
  • Phew, this is a huge step towards computer-controlled life... but no, I wouldn't be happy if a computer were watching my every step; who knows what they'll use that data for! It's OK as long as there is no abuse of the system and it really helps stressed businessmen and businesswomen.

    Anyway, what will happen when the sensors catch someone picking their nose? Big oops :-/
    • Anyway, what will happen when the sensors catch someone picking their nose?

      Never mind that, imagine that Taco gets one to watch over meetings of the /. editors. What does it do when it catches someone pouring hot grits in their pants, or "doing the goatse"?
  • by Anonymous Coward
    Their motto [motorola.com] is "Things are starting to talk to other things". But how long until the "thing" your PDA/sweater/whatever is talking to is in the hands of a crooked cop? Their "explanation" even shows the police getting evidence from a digital device without a warrant! Read the Fourth and Fifth Amendments to the U.S. Constitution to see where I'm coming from.
  • Herman's team connected a variety of off-the-shelf devices to a prototype meeting room that can take dictation, track individual speakers and, perhaps some day, answer spoken questions.

    I'd like to see this actually work... not the spoken questions part, just the simple dictation. I am doing a lot of research into voice recognition software right now, and I have yet to see software that can accurately transcribe free dictation from any given speaker.

    Sure, some software does pretty well if it is only looking for a few specific words. But give it a 60,000-word English dictionary to work with, and it just doesn't cut it. The best I've seen is maybe 70% accuracy, on a good day, in a quiet room, with my face right up to the microphone. 70% does not make for a final document that anyone wants to wade through and decipher.

    Anyone can say they support "dictation", but until the accuracy increases, it's like saying you support database transactions because you only corrupt 30% of the data.

    Has anyone out there had better experiences than me? What do you think is the best speech recognition software for free dictation right now? (My vote would be for ViaVoice.)
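
    (For what it's worth, here is roughly how I'm measuring "accuracy" -- a minimal word-error-rate check in Python. The sample sentences are made up for illustration; a real test would compare the recognizer's output against a full hand-made reference transcript.)

    ```python
    # Rough word-accuracy check: compare a reference transcript against
    # recognizer output. Accuracy is taken as 1 - WER, where WER counts
    # word-level substitutions, insertions, and deletions.

    def word_error_rate(reference: str, hypothesis: str) -> float:
        ref = reference.lower().split()
        hyp = hypothesis.lower().split()
        # Standard dynamic-programming edit distance over words.
        d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
        for i in range(len(ref) + 1):
            d[i][0] = i
        for j in range(len(hyp) + 1):
            d[0][j] = j
        for i in range(1, len(ref) + 1):
            for j in range(1, len(hyp) + 1):
                cost = 0 if ref[i - 1] == hyp[j - 1] else 1
                d[i][j] = min(d[i - 1][j] + 1,          # deletion
                              d[i][j - 1] + 1,          # insertion
                              d[i - 1][j - 1] + cost)   # substitution
        return d[len(ref)][len(hyp)] / max(len(ref), 1)

    if __name__ == "__main__":
        reference = "the smart room should take dictation from any speaker"
        hypothesis = "the smart room should fake dictation for many speaker"
        wer = word_error_rate(reference, hypothesis)
        print(f"WER: {wer:.0%}, word accuracy: {1 - wer:.0%}")
    ```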

    • We use ViaVoice for Linux, and the accuracy is around 85-90% from 3 meters away in a lab with ~50 dB of fan noise. The array rules. That's after training ViaVoice, of course, and with a speaker who knows how to over-articulate.

      If I were you, I'd suspect the microphone and/or the sound card. You should be getting better results than that.

      OG.
  • Sounds like the kitchen of the future to me...

    You know those things you saw on clips of 40-50's news reels.

    Neat idea, but I have to seiously question the dependence to such a degree on technology, technology is fragile, and a great tool, One sinlgle 100 megaton nuclear device exploded in the atmosphere would end all that for decades.

    EMP weaponry is going to be one of the tools of the future for armies at large, you think a Jumo Jet creates havoc. What would you do tommorow, if all the electronic devices you use were suddenly non functional , Could you survive ?, I could, I have a breaker ignition vehicle in storge, backup mechanical water pumping(on a well) and oil heat with no electronic controls. This isnt the norm, prevasive compution is a cool concept but sounds far too omniporesent and fragile for me.
    • I understand your anti-computer position. It would seem that your keyboard has declared a personal vendetta against you.
      • THAT WAS FUNNY !!!!! really...My spelling is bad, I think I get more comments on that than anything, but I pegged karma cap in 2 weeks...so go figure...

        I think youre right, I hate these damm 'soft touch' jobbers, I used my Original IBM PC , clicker job for as long as I could, Noone makes a mechanical clikc thunk one anymore.

        I am not anticomputer....In the least...I have owned and used a computer since 1978. I make my living with them.

        I am however concerned about a house of cards effect when it comes to my computer interaction. I have physical copies of all documents, banking etc. If, lets say if, all the computers ceased to function tommorow, EMP, Solar Flare, you name it. Would YOU as a person and as a family survive the consequences ? I would , and its nice to know I would. But many other, city dwellers, etc.

        It seems like an awfull fragile concept to base my daily life on at the moment
  • Just what we need. Betcha the next generation of this will end up making my tinfoil hat obsolete. Options?

  • Borrrrrring!
    I'm sure I saw an article in Scientific American like a bajzilllion eons ago (the annual mag devoted to computing?) on ubiquitous computing at Parc Place. It described all their prototype badges, flat panel scribblers, intelligent conference rooms, etc. that were going to change the way we work. Pretty much the same privacy concerns too...

    • Mark Weiser [xerox.com] was the director of Xerox PARC Computer Science Lab, when he first described Ubiquitous Computing [ubiq.com] in 1988.

      The article in Scientific American you saw "like a bajzillion eons ago" was probably the one written about the research at Xerox PARC by Mark Weiser, "The Computer for the Twenty-First Century [ubiq.com]," Scientific American, pp. 94-104, September 1991.

      Parc Place [parcplace.com] was a Xerox PARC spinoff that made a commercial product out of Smalltalk, which was originally developed at Xerox PARC long before Mark ran the lab. As far as I know, Parc Place didn't have much to do with Ubiquitous Computing -- they just sold a version of the Smalltalk programming language.

      Speaking of pioneering influential Xerox PARC research, has anyone else noticed the striking similarities between Microsoft's ".NET" and Xerox PARC's "Portable Common Runtime [xerox.com]"?

      -Don

      • Hey, thanks!
        I'm getting old (or at least I'm the oldest fogie at this company), and I forgot about the distinction between Parc Place and Xerox PARC.
        Too bad that everyone (Jobs, Gates, et al) stole everything but the carpets on the floor at PARC. Xerox could have been the leader and made a bundle if they could have discovered how to commercialize all that wonderful research...
        *sigh*

  • by D_Fresh ( 90926 ) <slashdot AT dougalexander DOT com> on Tuesday February 19, 2002 @12:58PM (#3032421) Journal
    In his novel A Deepness in the Sky [amazon.com] Vinge talks about sensors so small they are like dust motes floating through the air, but so pervasive (and networked) that the person tapping into them can get detailed surveillance and biometric data anywhere the sensors are floating. Sounds very much like where we're headed.
  • ... boot on my SoniCare toothbrush?

    Sounds stupid, but no more so than an internet-controlled thermostat.
  • eeery (Score:2, Funny)

    by GePS ( 543386 )
    is it just me, or would a beowulf cluster of these things really suck?

    just picture a whole bunch of bugs chirping "you appear to have attempted some form of productivity, is there any way I can obfuscate this process for you?"
  • That's what I get for mindlessly clicking.. I thought that said "Perversive Computing" and I was getting all hot and bothered..

    Waitaminute.. Computers everywhere?? Like "IN" things??? Hmmm....
  • At least they've targeted a conference room. That's a step forward.

    Years ago, I was talking to the "smart room" group at Xerox PARC, which basically had taken home automation to a new, but useless, level for offices. I told them they should do a conference room where all the stuff just works.

    What's needed is stuff like this:

    • HVAC recognizes people load. If fifty people enter a room, fan speeds increase, keeping airflow per person at a constant level.
    • Lighting, blinds/drapes, and screens are controlled by something that can tell if a projector is in use.
    • Smart public address systems; no need to wear a microphone, no manual controls, and never any feedback.
    • Room-sized cell phone node, which handles incoming calls and diverts them to voicemail telling the caller that the destination phone is in a meeting, giving the caller the option to override for emergencies.
    Now those features would actually sell.
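
    To make the first item on that list concrete, here is a minimal sketch of an occupancy-driven ventilation rule. The sensor and damper hooks (read_occupancy, set_damper) and the airflow numbers are hypothetical placeholders, not any real building-automation API:

    ```python
    # Sketch: scale ventilation with head count so that airflow per person
    # stays roughly constant. All interfaces here are hypothetical.

    CFM_PER_PERSON = 20.0   # target airflow per occupant (cubic feet/min)
    MIN_CFM = 100.0         # baseline ventilation for an empty room
    MAX_CFM = 2000.0        # hardware limit of the air handler

    def target_airflow(occupants: int) -> float:
        """Airflow setpoint in CFM for the current occupant count."""
        demand = MIN_CFM + occupants * CFM_PER_PERSON
        return min(max(demand, MIN_CFM), MAX_CFM)

    def control_step(read_occupancy, set_damper):
        """Poll the (hypothetical) occupancy sensor and push a new setpoint."""
        occupants = read_occupancy()           # e.g. badge count or camera
        set_damper(target_airflow(occupants))  # e.g. write to the VAV box

    # Fifty people walk into the room:
    print(target_airflow(50))   # -> 1100.0
    ```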
  • Pervasive computing is just another term for "Ubiquitous Computing", as described by the late Mark Weiser in 1988, when he was director of the Xerox PARC Computer Science Lab.

    Ubiquitous Computing [ubiq.com]

    Ubiquitous computing names the third wave in computing, just now beginning. First were mainframes, each shared by lots of people. Now we are in the personal computing era, person and machine staring uneasily at each other across the desktop. Next comes ubiquitous computing, or the age of calm technology, when technology recedes into the background of our lives. Alan Kay of Apple calls this "Third Paradigm" computing.

    Mark Weiser is the father of ubiquitous computing; his web page [ubiq.com] contains links to many papers on the topic.

    Two recent papers express elements of the ubiquitous computing philosophy: "Open House" [nyu.edu] (also in an MS Word version [ubiq.com]), and "Designing Calm Technology" [ubiq.com].

    What Ubiquitous Computing Isn't

    Ubiquitous computing is roughly the opposite of virtual reality. Where virtual reality puts people inside a computer-generated world, ubiquitous computing forces the computer to live out here in the world with people. Virtual reality is primarily a horse power problem; ubiquitous computing is a very difficult integration of human factors, computer science, engineering, and social sciences.

    Early Work in Ubiquitous Computing

    The initial incarnation of ubiquitous computing was in the form of "tabs", "pads", and "boards" [ubiq.com] built at Xerox PARC, 1988-1994. Several papers [ubiq.com] describe this work, and there are web pages for the Tabs [ubiq.com] and for the Boards [xerox.com] (which are a commercial product now).

    Ubicomp helped kick off the recent boom in mobile computing research [washington.edu], although it is not the same thing as mobile computing, nor a superset nor a subset.

    Ubiquitous Computing has roots in many aspects of computing. In its current form, it was first articulated by Mark Weiser [ubiq.com] in 1988 at the Computer Science Lab at Xerox PARC. He describes it like this:

    Ubiquitous Computing #1

    Inspired by the social scientists, philosophers, and anthropologists at PARC, we have been trying to take a radical look at what computing and networking ought to be like. We believe that people live through their practices and tacit knowledge so that the most powerful things are those that are effectively invisible in use. This is a challenge that affects all of computer science. Our preliminary approach: Activate the world. Provide hundreds of wireless computing devices per person per office, of all scales (from 1" displays to wall sized). This has required new work in operating systems, user interfaces, networks, wireless, displays, and many other areas. We call our work "ubiquitous computing". This is different from PDA's, dynabooks, or information at your fingertips. It is invisible, everywhere computing that does not live on a personal device of any sort, but is in the woodwork everywhere.

    Ubiquitous Computing #2

    For thirty years most interface design, and most computer design, has been headed down the path of the "dramatic" machine. Its highest ideal is to make a computer so exciting, so wonderful, so interesting, that we never want to be without it. A less-traveled path I call the "invisible"; its highest ideal is to make a computer so imbedded, so fitting, so natural, that we use it without even thinking about it. (I have also called this notion "Ubiquitous Computing", and have placed its origins in post-modernism.) I believe that in the next twenty years the second path will come to dominate. But this will not be easy; very little of our current systems infrastructure will survive. We have been building versions of the infrastructure-to-come at PARC for the past four years, in the form of inch-, foot-, and yard-sized computers we call Tabs, Pads, and Boards. Our prototypes have sometimes succeeded, but more often failed to be invisible. From what we have learned, we are now exploring some new directions for ubicomp, including the famous "dangling string" display.

    ========

    "Dedicated to the memory of Mark Weiser and Alan Turing" [piemenu.com]

    -Don

    • Calm Technology [berkeley.edu]

      Author: Jim Harris
      Posted: 11/6/2000; 4:57:21 PM
      Topic: Calm Technology

      [Illustration of the Dangling String display] [berkeley.edu]

      Calm Technology is what I call the goal of creating technology that truly honors the full model of human beings. I like this name because it begins with a word, "calm", that points us inward to the domain where we are truly human, and only secondarily mentions technology. Unlike ubiquitous computing, calm technology does not name a method, but a goal. Calm technology stands in sharp contrast to the enfranticing PC of today.

      More from Mark Weiser [darmstadt.gmd.de].

      Weiser comments on Dangling String: "Created by artist Natalie Jeremijenko, the "Dangling String" is an 8 foot piece of plastic spaghetti that hangs from a small electric motor mounted in the ceiling. The motor is electrically connected to a nearby Ethernet cable, so that each bit of information that goes past causes a tiny twitch of the motor. A very busy network causes a madly whirling string with a characteristic noise; a quiet network causes only a small twitch every few seconds. Placed in an unused corner of a hallway, the long string is visible and audible from many offices without being obtrusive."
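
      The mapping Weiser describes is simple enough to sketch. Assuming a hypothetical byte counter on the network interface and a hypothetical motor driver (neither is the actual PARC hardware), the whole display reduces to a traffic-rate-to-motor-speed loop:

      ```python
      # Dangling String sketch: poll a byte counter and drive a small motor
      # so that busier traffic means faster twitching. read_byte_count() and
      # set_motor_speed() stand in for whatever hardware is really attached.
      import time

      MAX_BYTES_PER_SEC = 1_250_000   # assume a saturated 10 Mbit/s Ethernet

      def run(read_byte_count, set_motor_speed, interval=0.5):
          last = read_byte_count()
          while True:
              time.sleep(interval)
              now = read_byte_count()
              rate = (now - last) / interval               # bytes per second
              last = now
              # Map traffic rate onto a 0.0-1.0 motor duty cycle.
              set_motor_speed(min(rate / MAX_BYTES_PER_SEC, 1.0))
      ```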

      Check out The Coming Age of Calm Technology [ubiq.com], by Mark Weiser and John Seely Brown.

      ========

      -Don

      • Opening Keynote Speech
        The Invisible Interface: Increasing the Power of the Environment through Calm Technology [darmstadt.gmd.de]
        Mark Weiser
        Xerox Palo Alto Research Center (PARC)
        Palo Alto, CA 94304, USA
        Email: weiser@parc.xerox.com

        The information technology revolution, fifty years old, is an infant in the scale of human affairs. It is the culmination of the 350-year tradition of Descartes and Modernism, which created an explosion of technology that also threatens sometimes to bury human beings in its rubble. The characteristics of the modern PC are symptomatic of this entire trend: incredible power within a narrow technical domain, but also isolation from the world, difficulty of use, pulling of us into it and away from other people, distortions of wisdom by what can be digitally measured. This workshop is part of the antidote. Here we bring together many of the leading practitioners of the twenty-first century world-after-the-PC, a world after modernism, a world that is characterized by technology in its proper place, not dominating, but cooperating. A world fundamentally more spiritual, and more calming, than today. Here at this workshop we must avoid the academic tendency to fractionalize, to divide, to emphasize our differences. We are on a common mission: how to create technology that truly honors humans. The challenge dwarfs our disagreements.

        Ten years ago I started on a journey I call Ubiquitous Computing (UC). I am pleased to see that many people at this workshop took some inspiration from UC, and it has been much improved by your many contributions. UC took its inspiration from an anthropological critique of the PC, which said that an isolating, desocializing, distancing technology would eventually change to accommodate human needs. UC tried to anticipate that change by a series of experiments of putting computers into the environment, starting with wall-sized screens, and moving to book-sized and pocket-sized interactive devices. Our focus was on invisibility, at disappearing the "computer" to let the pure human interaction come forward. I must admit to you, largely we failed. Oh, we learned a great deal about user interfaces, radio systems, hand-held design, pen systems, mobile networks, low-cost electronics, batteries, etc, and by the standards of technological excellence and impact we succeeded very well.

        Ubiquitous computing is coming to pass, and our work is widely cited. But we did not succeed at creating the invisibility we craved. We did not because we did not appreciate the enormity of the challenge, primarily the challenge of a proper model of the human being for whom we were designing.

        From the work of all of us here at this workshop, we are getting closer to understanding the right model of a human being, a model that would teach us how to put technology in the background, invisibly. From my own work, and from reading your work, I get this model: Consider a human being as a kind of iceberg. Above the water, sticking out into conscious attention, are those objects and thoughts of which we are currently aware. Below the surface, rooting those thoughts, is a much deeper foundation of tacit assumptions and knowledge. At every moment that we go about our conscious affairs, we are relying upon that deep tacit foundation. For example, as you read this text you are taking in whole words, perhaps even whole concepts. To do this you rely on a tacit base of visual processing, line perception, font perception, grammar, word senses, and so on. What is below the surface of the iceberg is the much larger part of what makes us smart, and makes us human. I call what is above the surface the "center". I call what is below the "periphery".

        Now there are some important characteristics of the center and periphery. First, the more the periphery is engaged, the smarter we are. No amount of conscious working out can replace the intuitions of the expert. The smartest people are the ones who have built up the thickest periphery, and can apply it quickly to new problems. A fully engaged periphery also goes by the name of "flow state", familiar to athletes. Second, we are constantly moving items into and out of the periphery. Millisecond by millisecond what was just periphery becomes center, and then back again. To move perception in and out quickly is a source of great power and comfort. Third, take the periphery away and we are crippled. Imagine looking at the world through narrow tubes taped to our eyes, blocking peripheral vision: you would stumble, and be constantly surprised, and tire quickly. Digital technology in the PC is like those tubes: it presents a view excessively stripped of periphery.

        This model of center and periphery leads to a humble view of the role of technology in human affairs. The ineffable complexity of a given person's active iceberg dominates any situation. The role of technology is to fit in, and not just fit in with what is above the surface, but with what is below as well. In fact fitting with the periphery is far more important, because that is a thicker and richer domain by far than mere conscious attention, and it is also determinative of conscious attention.

        "Calm Technology" is what I call the goal of creating technology that truly honors the full model of human beings. I like this name because it begins with a word, "calm", that points us inward to the domain where we are truly human, and only secondarily mentions technology. Unlike ubiquitous computing, "calm technology" does not name a method, but a goal. "Calm technology" stands in sharp contrast to the enfranticing PC of today.

        When one follows the iceberg down below the surface, one finds not only tacit knowledge, but also the everyday environment. Part of what lies in the periphery is the situation around us, the physical (and cognitive and emotional) affordances of the everyday world. And this is why a workshop on cooperative buildings is at the cutting edge of twenty-first century life. Because it is our buildings that are the primary physical environment of everyday life.

        Is most technology designed today to honor the periphery? Is most information technology encalming? Regretfully not. But in the domain of cooperative buildings, we find very smart and innovative thinkers who are taking this step. Many of you will not find the model I propose above new, because I got it in part from reading your papers. By expressing it to you, I want to move us towards agreement on our common challenge, so that at this workshop we stand on each others shoulders, not each others toes. Let us work together to create a twenty-first century of intense calm. Thank you.

        -Mark Weiser

        ====

        -Don

  • From The Coming Age of Calm Technology [ubiq.com] by Mark Weiser and John Seely Brown:

    There is much talk today about "thin clients," meaning lightweight Internet access devices costing only a few hundred dollars. But Ubiquitous Computing will see the creation of thin servers, costing only tens of dollars or less, that put a full Internet server into every household appliance and piece of office equipment. The next generation Internet protocol, IPv6[5], can address more than a thousand devices for every atom on the earth's surface[6]. We will need them all.
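
    A quick back-of-the-envelope check on that quoted figure; the ~10^34 estimate for the number of atoms in a one-atom-thick layer over the Earth's surface is my own rough assumption, not a number from the paper:

    ```python
    # Sanity check of the IPv6 claim quoted above.
    ipv6_addresses = 2 ** 128      # size of the IPv6 address space
    surface_atoms = 10 ** 34       # assumed: one-atom-thick layer over ~5e14 m^2

    print(f"{ipv6_addresses:.2e} addresses")                            # ~3.40e+38
    print(f"{ipv6_addresses / surface_atoms:.0f} addresses per atom")   # ~34028
    ```

    That works out to a few tens of thousands of addresses per atom, so "more than a thousand" is, if anything, conservative.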

    ====

    -Don

  • Well, as far as conference rooms go, we could be talking some useful, voluntary commercial products here - as per the previous post with the automatic HVAC. Companies will pay for this kind of thing, and it's their own fault if the FBI is listening.

    Of course, if your meetings are run by monkeys, giving the monkeys PDAs, automatic lighting, and centrally controlled mike/PA systems won't do much good.

    With regards to the "sensors everywhere" syndrome, well, I hope you don't have a pacemaker, because the area is clear, we can talk. :)

    Could you imagine how often those sensors would be destroyed? It would be impossibly expensive, just like those red light cameras that keep getting CB caps shot through them...

  • http://www.nist.gov/smartspace/ [nist.gov]

    If you have questions about the system, please feel free to ask.
  • I have been interested in this topic for quite some time now, and I thought I'd share a white paper [PDF] [ibm.com] from IBM Pervasive Computing [ibm.com] that is along the same lines and goes into much greater detail. Enjoy.
  • I'm amazed the story doesn't mention Andy Hopper's work at what is currently called 'AT&T Labs' (which has had a number of owners over the years, and is _not_ something that came out of AT&T).

    Andy Hopper's early work at Cambridge University involved building the Cambridge Ring, an early local-area network. He also created the first implementation of something close to the heart of every /. reader - Ethernet on a chip.

    Over the years he and his companies have done much work on networking technologies, and the applications they enable. Pervasive computing, or sentient computing as they call it, is one of these applications.

    The massively popular remote-control desktop software VNC was a product of this research. It was created so that the sentient computing system could detect which computer monitor you are nearest to and facing, then export your computer's display to it. Once it was decided that this software on its own held little financial promise, Andy decided it should be open-sourced to generate goodwill, further the project, and raise its profile. He doesn't believe in sitting on code that you are not fully exploiting.
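
    Conceptually, the "follow-me desktop" part is easy to sketch. Assuming the location system hands you a person's coordinates, picking the nearest display is just a distance comparison; the display positions and the launch step below are hypothetical placeholders, not the actual AT&T code:

    ```python
    # Sketch: choose the display closest to a person's tracked position and
    # (hypothetically) bring their desktop up on it with a VNC viewer.
    import math
    import subprocess

    DISPLAYS = {
        # display name         (x, y) position in metres -- made-up values
        "meetingroom-wall":    (0.0, 0.0),
        "office-12-monitor":   (8.5, 3.0),
        "lab-corner-screen":   (2.0, 7.5),
    }

    def nearest_display(person_xy):
        """Name of the display closest to the person's (x, y) position."""
        px, py = person_xy
        return min(DISPLAYS, key=lambda name: math.hypot(DISPLAYS[name][0] - px,
                                                         DISPLAYS[name][1] - py))

    def follow_me(person_xy, vnc_server="mydesktop:1"):
        target = nearest_display(person_xy)
        # In a real deployment this would run on the machine driving `target`.
        subprocess.run(["vncviewer", vnc_server])
        return target

    print(nearest_display((1.5, 6.0)))   # -> lab-corner-screen
    ```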
