bookmarks  34

  •  

    In "Augmenting Human Intellect: A Conceptual Framework," Doug Engelbart, head of the Augmentation Research Center at Stanford Research Institute, presented a philosophy that favored efficiency over ease of use in human-computer interaction, notes Richard Monson-Haefel. In essence, Engelbart felt that basing computer interactions on the most efficient systems was the best way to achieve an optimal human-computer symbiosis. Monson-Haefel thinks the best embodiment of Engelbart's views is his five-key chorded keyboard, designed for use with one hand, which enabled very rapid data entry and computer interaction when combined with the computer mouse, which Engelbart also conceived. The keyboard-mouse combination was very tough to learn, which points to the crux of Engelbart's dilemma: more efficient and potentially more powerful human-computer interfaces have a very steep learning curve. Monson-Haefel says the modern approach to human-computer interaction stresses ease of use and usability without training, which runs counter to Engelbart's philosophy, even though that philosophy led to some of the most exceptional computer technologies in use today. Monson-Haefel does not think Engelbart's preference for efficiency is an unsound notion, reasoning that "perhaps, like the violin, people could reach a new level of synergy with computers if they followed Engelbart's philosophy and focused on efficiency over ease-of-use."
    16 years ago by @gwpl
     
     
  •  

    University of Arizona researchers are developing hybrid hardware/software systems that could eventually use machine intelligence to allow spacecraft to fix themselves. Arizona professor Ali Akoglu is using field programmable gate arrays (FPGAs) to build self-healing systems that can be reconfigured as needed to emulate different types of hardware. Akoglu says general-purpose computers can run a variety of systems but are extremely slow compared with hard-wired systems designed to perform specific tasks. What is needed, Akoglu says, are systems that combine the speed of hard-wired systems with the flexibility of general-purpose computers, which is what he is trying to accomplish using FPGAs. The researchers are testing five wirelessly linked hardware units that could represent a combination of landers and rovers on Mars. Akoglu says the system tries to recover from a malfunction in two ways. First, the unit tries to fix itself at the node level by reprogramming malfunctioning circuits. If that fails, the unit tries to recover by employing redundant circuitry. If the unit's onboard resources cannot fix the problem, the network-level intelligence is alerted and another unit takes over the functions performed by the broken unit. If two units go down, the three remaining units divide the tasks. "Our objective is to go beyond predicting a fault to using a self-healing system to fix the predicted fault before it occurs," he says.
    16 years ago by @gwpl
     
      acm_technews
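The two-tier recovery Akoglu describes can be sketched as a toy simulation. The repair success probabilities, unit names, and task lists below are all invented for illustration; the escalation order (local reprogram, redundant circuit, then network-level takeover) follows the article.

```python
import random

class Node:
    """One lander/rover unit with reconfigurable (FPGA-like) circuits."""
    def __init__(self, name, tasks):
        self.name = name
        self.tasks = list(tasks)
        self.alive = True

    def try_local_repair(self):
        # Step 1: reprogram the faulty circuit; step 2: fall back to a
        # redundant circuit. Both are modeled here as coin flips.
        return random.random() < 0.5 or random.random() < 0.5

def handle_fault(nodes, faulty):
    if faulty.try_local_repair():
        return f"{faulty.name}: recovered at node level"
    # Network-level recovery: surviving units split the failed unit's tasks.
    faulty.alive = False
    survivors = [n for n in nodes if n.alive]
    for i, task in enumerate(faulty.tasks):
        survivors[i % len(survivors)].tasks.append(task)
    faulty.tasks = []
    return f"{faulty.name}: failed; tasks redistributed"

nodes = [Node(f"unit{i}", [f"task{i}a", f"task{i}b"]) for i in range(5)]
print(handle_fault(nodes, nodes[0]))
```

In the real system the first two steps would reconfigure FPGA logic rather than flip coins, but no task is ever lost: either the unit heals itself or the network absorbs its workload.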
       
       
    •  

      The Defense Advanced Research Projects Agency has issued a call for research proposals to design compilers that can dynamically optimize programs for specific environments. As the Defense Department runs programs across a wider range of systems, it faces the lengthy and manual task of tuning programs to run under different environments, a process DARPA wants to automate. "The goal of DARPA's envisioned Architecture-Aware Compiler Environment (AACE) Program is to develop computationally efficient compilers that incorporate learning and reasoning methods to drive compiler optimizations for a broad spectrum of computing system configurations," says DARPA's Broad Agency Announcement (BAA). The compilers can be written in the C and Fortran programming languages, but the BAA encourages work in languages that support techniques for the parallelization of programs. The quality of the proposals will determine how much DARPA spends on the project, which will run at least through 2011. Proposals are due by June 2.
      16 years ago by @gwpl
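A minimal sketch of the kind of empirical, architecture-aware tuning AACE envisions: time several functionally equivalent candidates on the target machine and keep the fastest. The two traversal orders below are invented stand-ins for compiler-generated variants; which wins depends on the machine's memory hierarchy.

```python
import time

def time_once(fn, *args):
    """Measure one run of fn with a monotonic high-resolution clock."""
    t0 = time.perf_counter()
    fn(*args)
    return time.perf_counter() - t0

# Two equivalent "schedules" for the same computation: row-major versus
# column-major traversal of a matrix.
def sum_rows_first(m):
    return sum(m[i][j] for i in range(len(m)) for j in range(len(m[0])))

def sum_cols_first(m):
    return sum(m[i][j] for j in range(len(m[0])) for i in range(len(m)))

def autotune(candidates, arg):
    """Pick the candidate that runs fastest on this particular machine."""
    timings = {f.__name__: time_once(f, arg) for f in candidates}
    return min(timings, key=timings.get)

matrix = [[1] * 300 for _ in range(300)]
best = autotune([sum_rows_first, sum_cols_first], matrix)
print("selected:", best)
```

A real AACE-style compiler would search a far larger space of schedules and use learned models to prune it, but the feedback loop is the same: measure on the target, then choose.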
       
       
    •  

      At the International World Wide Web Conference in Beijing, two Google researchers unveiled VisualRank, software they say will advance digital image searching on the Web the same way Google's PageRank software advanced Web page searches. VisualRank is an algorithm that blends image-recognition software methods with techniques that weigh and rank the images that look most similar. Most image searches are based on cues from the text associated with each image, not on the actual content of the image itself, and image analysis remains a largely unsolved problem in computer science, the Google researchers say. "We wanted to incorporate all of the stuff that is happening in computer vision and put it in a Web framework," says Google's Shumeet Baluja, who made the presentation along with Google researcher Yushi Jing. Their paper, "PageRank for Product Image Search," focuses on a subset of the images that Google has cataloged. The researchers concentrated on the 2,000 most popular product queries on Google's product search and sorted the top 10 images from both its ranking system and the standard Google Image Search results. The research effort then used a team of 150 Google employees to create a scoring system for image "relevance." The researchers say VisualRank returned 83 percent fewer irrelevant images.
      16 years ago by @gwpl
       
        acm_technews
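The ranking step can be illustrated as a damped random walk over an image-similarity graph, analogous to PageRank over hyperlinks: images that many other images resemble accumulate rank. The similarity matrix below is made up for the sketch; in VisualRank such entries would come from matched local image features.

```python
import numpy as np

# Toy symmetric similarity matrix for 4 images (entries in [0, 1]).
# Image 3 is weakly connected to the rest.
S = np.array([
    [0.0, 0.8, 0.7, 0.1],
    [0.8, 0.0, 0.6, 0.1],
    [0.7, 0.6, 0.0, 0.2],
    [0.1, 0.1, 0.2, 0.0],
])

def visual_rank(S, damping=0.85, iters=100):
    """Stationary distribution of a damped walk on the similarity graph."""
    n = len(S)
    P = S / S.sum(axis=0)              # column-stochastic transition matrix
    r = np.full(n, 1.0 / n)            # start from a uniform distribution
    for _ in range(iters):
        r = damping * P @ r + (1 - damping) / n
    return r / r.sum()

ranks = visual_rank(S)
print(ranks.argsort()[::-1])           # images ordered most to least central
```

The weakly connected image ends up ranked last, which is the intuition behind the reported drop in irrelevant results: visually isolated images fall to the bottom.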
         
         
      •  

        Non-repudiation is a system whereby sensitive data sent over the Internet is digitally signed at the source with a signature that can be traced to the user's computer as a safeguard against fraud, but Len Sassaman of the Catholic University of Leuven warns that making this system the default setting for all traffic on a network would enable authorities to trace the source of any online activity and take away users' anonymity. Worse still, Sassaman and University of Ireland colleague Meredith Patterson say that the One Laptop per Child (OLPC) foundation is unintentionally engaged in establishing such a system throughout the Third World by supplying inexperienced users with Internet-ready laptops. Theft of the laptops is discouraged with a security model called Bitfrost, in which each laptop automatically phones an anti-theft server and sends its serial number once a day to obtain an activation key, and any machine reported stolen is refused activation. Sassaman and Patterson caution that the security model's use of non-repudiable digital signatures could be exploited by oppressive regimes to identify and silence dissidents. "They may not intend for the signatures to be used for non-repudiation, but it's possible to use them for this purpose," Sassaman says. Although the OLPC laptops are primarily intended for educational purposes, which some people claim would preclude government monitoring, Sassaman says it is unlikely that the systems will be used solely by children, and that conditions in some developing nations might actually encourage children to act as whistleblowers. Sassaman and Patterson are modifying the Bitfrost security model to enable the laptops to identify each other without compromising their users' privacy, based on existing cryptographic methods that cannot be employed for non-repudiation.
        16 years ago by @gwpl
         
          acm_technews
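The distinction Sassaman draws can be illustrated with Python's standard library. A shared-key MAC authenticates a message *without* non-repudiation: because both parties hold the key, either could have produced the tag, so a third party cannot pin the message on one sender. A public-key signature, by contrast, could only have come from the private-key holder. This is an illustrative sketch, not the actual Bitfrost protocol; the key and message formats are invented.

```python
import hashlib
import hmac

# Hypothetical secret shared between one laptop and the anti-theft server.
shared_key = b"laptop-A+server-secret"

def tag(message: bytes) -> bytes:
    """Authenticate a message with a shared-key MAC (repudiable)."""
    return hmac.new(shared_key, message, hashlib.sha256).digest()

def verify(message: bytes, t: bytes) -> bool:
    # Constant-time comparison avoids leaking the tag through timing.
    return hmac.compare_digest(tag(message), t)

msg = b"serial=12345 status=active"
t = tag(msg)
print(verify(msg, t))             # the server accepts the daily check-in
print(verify(b"serial=999", t))   # a forged message fails verification
```

Deniable schemes like this are in the spirit of the fix Sassaman and Patterson propose: the server can still authenticate check-ins, but the tag proves nothing to anyone outside the pair.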
           
           
        •  

          Quantum computers would be able to process information in ways that standard computers cannot by tapping the unusual properties of quantum mechanics, but an analysis suggests that quantum computers would outclass conventional machines only by a slight degree for most computing problems, writes MIT professor Scott Aaronson. Evidence now indicates that quantum machines would be susceptible to many of the same algorithmic restrictions as classical computers, and these restrictions are totally independent of the practical problems of constructing quantum computers. A solid quantum algorithm must ensure that computational paths leading to an incorrect answer cancel out while paths leading to a correct answer reinforce one another, Aaronson says. The discovery of an efficient quantum algorithm to solve NP-complete problems remains elusive despite much effort, but one definite finding is that such an algorithm would have to efficiently exploit the problems' structure in a manner that is beyond the capabilities of present-day methods. Aaronson points out that physicists have yet to come up with a final theory of physics, which leaves open the possibility that a physical way to efficiently solve NP-complete problems may one day be revealed by a future theory. "People speculate about yet more powerful kinds of computers, some of which would make quantum computers look as pedestrian as vending machines," he notes. "All of them, however, would rely on speculative changes to the laws of physics." Aaronson projects that the difficulty of NP-complete problems will someday be perceived as a basic principle that describes part of the universe's fundamental nature.
          16 years ago by @gwpl
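The path-cancellation idea Aaronson describes shows up in the smallest possible example: applying the Hadamard gate to |0> twice. Because amplitudes (which can be negative) are tracked rather than probabilities, the two paths leading to |1> cancel exactly, and the system returns to |0> with certainty.

```python
import math

# The Hadamard gate: sends |0> to an equal superposition of |0> and |1>,
# and |1> to a superposition with a NEGATIVE amplitude on |1>.
H = [[1 / math.sqrt(2),  1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]

def apply(gate, state):
    """Multiply a 2x2 gate into a 2-amplitude state vector."""
    return [sum(gate[i][j] * state[j] for j in range(2)) for i in range(2)]

state = [1.0, 0.0]       # start in |0>
state = apply(H, state)  # superposition: both outcomes seem possible
state = apply(H, state)  # the two paths to |1> interfere destructively
probs = [a * a for a in state]
print([round(p, 10) for p in probs])
```

Designing an algorithm where the cancellation singles out the answer to an NP-complete problem is exactly what, per Aaronson, no known technique achieves.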
           
           
        •  

          Robots could fill 3.5 million jobs in Japan by 2025, concludes a new Machine Industry Memorial Foundation report. The report says robots have the potential to save $21 billion in health care costs for the elderly by 2025. Robots could help caregivers with children or older people by reading books out loud or helping bathe the elderly, and they also could do some housework. Such assistance would free people to focus on more important things; caregivers, for example, could gain more than an extra hour a day. The robots could range from micro-sized capsules that detect lesions to high-tech vacuum cleaners, but it could take more time before they have a big impact in Japan. "There's the expensive price tag, the functions of the robots still need to improve, and then there are the mindsets of people," says Takao Kobayashi, who worked on the study. "People need to have the will to use the robots."
          16 years ago by @gwpl
           
           
        •  

          Computer scientist Donald E. Knuth, winner of ACM's A.M. Turing Award in 1974, says in an interview that open-source code has yet to reach its full potential, and he anticipates that open-source programs will start to be totally dominant as the economy makes a migration from products to services, and as increasing numbers of volunteers come forward to tweak the code. Knuth admits that he is unhappy about the current movement toward multicore architecture, complaining that "it looks more or less like the hardware designers have run out of ideas, and that they're trying to pass the blame for the future demise of Moore's Law to the software writers by giving us machines that work faster only on a few key benchmarks!" He acknowledges the existence of important parallelism applications but cautions that they need dedicated code and special-purpose methods that will have to be significantly revised every several years. Knuth maintains that software produced via literate programming was "significantly better" than software whose development followed more traditional methodologies, and he speculates that "if people do discover nice ways to use the newfangled multithreaded machines, I would expect the discovery to come from people who routinely use literate programming." Knuth cautions that software developers should be careful when it comes to adopting trendy methods, and expresses strong reservations about extreme programming and reusable code. He says the only truly valuable thing he gets out of extreme programming is the concept of working in teams and reviewing each other's code. Knuth deems reusable code to be "mostly a menace," and says that "to me, 're-editable code' is much, much better than an untouchable black box or toolkit."
          16 years ago by @gwpl
           
            acm_technews
             
             
          •  

            Both young men and women are avoiding high school courses that could lead to careers in IT, but young women are dropping those courses faster than young men, says Australia's Charles Sturt University Faculty of Education dean Toni Downes. Downes was a senior member of a research project that examined the interest of male and female high school students in particular high school subjects. The study of 1,334 male and female students found that only 13 percent of girls said they would study IT-related subjects in their senior years, and both boys and girls shied away from high school computing and IT subjects between 2002 and 2007. Downes believes that a shift in computer curriculum from a combination of computer literacy and foundational studies to computing and IT as an academic discipline has contributed to the decline in enrollments, particularly among females. "The reasons are complex, but the reasons that girls give are often the same reasons that disinterested boys give," Downes says. "Sometimes they are making their judgments on careers based on stereotypes, sometimes the girls are making their decisions based on self-limiting identities like 'it's not cool for me to be a nerd' because they think the career is nerdy." Downes says part of the problem is that girls do not engage with technology in ways that allow them to use it playfully, instead of just functionally, so they are not attracted to thinking creatively or critically about how and why technology works.
            16 years ago by @gwpl
             
              acm_technews
               
               
            •  

              Many women in IT credit their mothers for making them believe they could succeed in any career. IT and service manager Priscilla Milam says when she got into computer science there were no other women in the program, and it was her mother who told her to learn to live in a man's world, encouraging her to read the headlines in the financial pages, sports pages, and general news, and not to get emotional. "Still, IT in general is a man's world, and by keeping up with the news and sports, when the pre/post meetings end up in discussions around whether the Astros won or lost or who the Texans drafted, I can participate; and suddenly they see me as part of the group and not an outsider," Milam says. Catalyst says the percentage of women holding computer and mathematics positions has declined since 2000, from 30 percent to 27 percent in 2006. Milam and other women in high-tech positions say a passion for technology begins early in life and a few encouraging words from their mothers helped them realize they could overcome the challenges that exist when entering an industry dominated by men. CSC lead solution architect Debbie Joy says the key to succeeding in IT is to put gender aside at work and learn to regard colleagues as peers, and soon they will do the same.
              16 years ago by @gwpl
               
                acm_technews
                 
                 
              •  

                Nintendo set to launch "Wii Fit" exercise game. For years, video games have been blamed for turning kids into idle layabouts who only venture off the couch to fill up on potato chips and soda. Nintendo Co Ltd now aims to shatter that image with a game that aims to get players off the couch and lead them to stretch, shake and sweat their way to a healthy life.
                16 years ago by @gwpl
                 
                  acm_technews
                   
                   
                •  

                  Researchers led by Carnegie Mellon University professor David Brumley have found that software patches could be just as harmful as they are helpful, because attackers could use a patch to automatically generate, in as little as 30 seconds, software that attacks the very vulnerabilities the patch is supposed to fix. The malicious software could then be used to attack computers that had not yet received and installed the patch. Microsoft Research's Christos Gkantsidis says it takes about 24 hours to distribute a patch through Windows Update to 80 percent of the systems that need it. "The problem is that the infrastructure capacity that exists is not enough to serve all the users immediately," Gkantsidis says. "We currently don't have enough technologies that can distribute patches as fast as the worms." This distribution delay gives attackers time to receive a patch, find out what it is fixing, and create and distribute an exploit that will infect computers that have not yet received the patch. The researchers say new methods for distributing patches are needed to make them more secure. Brumley suggests taking steps to hide the changes that a patch makes, releasing encrypted patches that cannot be decrypted until the majority of users have downloaded them, or using peer-to-peer distribution methods to release patches in a single wave.
                  16 years ago by @gwpl
                   
                    acm_technews
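The encrypted-release idea can be sketched in a few lines: ship the patch as ciphertext first, and broadcast the short key only once most clients hold it, so nobody can diff the patch before everyone can apply it. The XOR keystream below is a deliberately toy stand-in for a real cipher, and the patch contents and key are invented.

```python
import hashlib
import itertools

def keystream(key: bytes):
    """Deterministic byte stream derived from the key (illustrative only)."""
    for counter in itertools.count():
        yield from hashlib.sha256(key + counter.to_bytes(8, "big")).digest()

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # XOR is its own inverse, so the same call encrypts and decrypts.
    return bytes(b ^ k for b, k in zip(data, keystream(key)))

patch = b"--- a/auth.c\n+++ b/auth.c\n-  if (len > 64)\n+  if (len >= 64)\n"
key = b"released-after-distribution"

ciphertext = xor_cipher(patch, key)      # phase 1: distribute this widely
recovered = xor_cipher(ciphertext, key)  # phase 2: clients decrypt locally
print(recovered == patch)
```

The small key can be pushed to all clients almost simultaneously, collapsing the 24-hour window Gkantsidis describes into the time it takes to broadcast a few dozen bytes.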
                     
                     
                  •  

                    Who are the best spreaders of information in a social network? The answer may surprise you.
                    15 years ago by @gwpl
                     
                     
