Bookmarks (34)

  •  

    The Replicating Rapid-prototyper (RepRap) is an open source, self-copying 3D printer that builds objects in layers of plastic, primarily polylactic acid, a biodegradable polymer made from lactic acid. Unlike existing prototyping printers, RepRap can replicate and update itself, including printing its own parts, says RepRap software developer Vik Olliver. The RepRap development team is spread across New Zealand, the United Kingdom, and the United States. By making the project open source, the team hopes to keep improving the machine until it can do what people want it to do. Improvements received by the team are then sent back to users, allowing RepRap to evolve as a whole. A recently added feature is a set of interchangeable heads for different kinds of plastic. Olliver says a head that deposits low melting-point metal is in development, which would allow low melting-point metal to be placed inside higher melting-point plastic, enabling the production of structures such as motors. RepRap could also allow people to build circuits in three dimensions and in various shapes. Having the machine copy itself is the most useful feature the team can give it and is the primary goal of the project, Olliver says.
    17 years ago by @gwpl

  •  

    Without significant new investment, the Internet's current network architecture will reach the limits of its capacity by 2010, warned AT&T's Jim Cicconi at the Westminster eForum on Web 2.0 in London. "The surge in online content is at the center of the most dramatic changes affecting the Internet today," Cicconi says. "In three years' time, 20 typical households will generate more traffic than the entire Internet today." Cicconi says at least $55 billion in new infrastructure investment is needed over the next three years in the United States alone, and $130 billion worldwide. The "unprecedented new wave of broadband traffic" will increase fifty-fold by 2015, Cicconi predicts, adding that AT&T will invest $19 billion to maintain and upgrade the core of its network. He adds that growing demand for high-definition video will put increasing strain on the Internet's infrastructure, noting that eight hours of video is loaded onto YouTube every minute and that HD video consumes seven to 10 times more bandwidth than standard video. "Video will be 80 percent of all traffic by 2010, up from 30 percent today," he says. (The growth rate implied by the fifty-fold figure is worked out in the sketch below.)
    17 years ago by @gwpl
      acm_technews
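
    A quick back-of-the-envelope check on the fifty-fold figure, in Python.
    The 2008 baseline year is an assumption here; the talk itself only says
    "by 2015":

        # What compound annual growth rate turns 1x of traffic into 50x
        # between 2008 and 2015? (The 2008 baseline is assumed, not stated.)
        years = 2015 - 2008                 # 7 years
        factor = 50 ** (1 / years)          # ~1.75x per year
        print(f"implied growth: {factor:.2f}x (~{(factor - 1):.0%} per year)")

    That works out to roughly 75 percent compound traffic growth per year.
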
  •  

    Music professors Clifton Callender at Florida State University, Ian Quinn at Yale University, and Dmitri Tymoczko at Princeton University have developed a new way of analyzing and categorizing music using the complex mathematics found within music itself. The new method, called "geometrical music theory," looks at sequences of notes, chords, rhythms, and scales and categorizes them so they can be grouped into "families." The families can be given a mathematical structure that can be represented by points in complex geometrical spaces, similar to the x-y graphing used in algebra. Different categorizations produce distinct geometrical spaces, reflecting the various ways musicians of different eras understood music. The researchers say that having tools for conceptualizing music could lead to a variety of applications, such as new instruments, new musical toys, and new visualization tools. Tymoczko says the most satisfying part for him is being able to see the logical structure linking many different musical concepts. "To some extent, we can represent the history of music as a long process of exploring different symmetries and different geometries," he says. (A toy example of grouping chords into families appears below.)
    17 years ago by @gwpl
      acm_technews
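
    A toy sketch of the "families" idea, assuming nothing beyond the article:
    treat a chord as a set of pitch classes (octave equivalence) and reduce it
    to a canonical form under transposition, so that chords which are
    transpositions of one another land in the same family. This covers only
    two of the symmetries the theory considers; it is not the authors' model.

        def canonical_form(chord):
            """Transposition-invariant representative of a chord (MIDI input)."""
            pcs = sorted(set(p % 12 for p in chord))     # octave equivalence
            candidates = []
            for i in range(len(pcs)):                    # every cyclic rotation,
                rotated = pcs[i:] + pcs[:i]              # transposed to start at 0
                root = rotated[0]
                candidates.append(tuple((p - root) % 12 for p in rotated))
            return min(candidates)                       # canonical member

        c_major = [60, 64, 67]   # C4, E4, G4
        g_major = [67, 71, 74]   # G4, B4, D5
        c_minor = [60, 63, 67]   # C4, Eb4, G4
        print(canonical_form(c_major))   # (0, 3, 8)
        print(canonical_form(g_major))   # (0, 3, 8) -- same family as C major
        print(canonical_form(c_minor))   # (0, 3, 7) -- a different family

    Chords in the same family map to the same point in the corresponding
    quotient space; richer spaces arise from quotienting by more symmetries.
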
  •  

    Experts at FutureNet, an annual conference held to address communications services, say the Internet architecture will face severe challenges over the next few years that could significantly strain the Web's effectiveness. One of the most prominent issues facing the Internet is the impending shortage of IP addresses, which some forecasters say could occur within the next few years. IPv4 offers about 4.3 billion possible IP addresses (2^32), and it is running out of capacity. Juniper's Ron Bonica says there are three likely solutions to this problem. The first is to stick with IPv4, which would create some immediate problems with the impending address shortage but would also lead to the creation of an IP address trading system through which companies and individuals that own an excessive number of addresses could sell them at market value. Another possibility would be a rapid deployment of IPv6, the next-generation Internet Protocol, whose 128-bit addresses support a vastly larger address space (roughly 3.4 x 10^38 addresses). Bonica says many companies and organizations are reluctant to make the switch because it will require significant investments on the part of end users and ISPs, and transition mechanisms to help make the switch have not been deployed yet. Bonica says the third option is a compromise between these two solutions that involves a gradual shift from IPv4 to IPv6. Another issue FutureNet addressed was the strain more IP addresses will place on routing tables, which are not scalable and cannot adapt to exponential increases in IP addresses. "The basic, fundamental problems of scaling a network haven't been addressed in any innovative manner," says American Registry of Internet Numbers Chairman John Curran. (The address-space arithmetic is sketched below.)
    17 years ago by @gwpl
      acm_technews
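
    The address-space arithmetic behind the IPv4/IPv6 discussion, as a quick
    sketch using Python's standard ipaddress module. The two example addresses
    come from the reserved documentation ranges, not from the article:

        import ipaddress

        print(f"IPv4: {2 ** 32:,} addresses")      # 4,294,967,296 (~4.3 billion)
        print(f"IPv6: {2 ** 128:.2e} addresses")   # ~3.40e+38

        v4 = ipaddress.ip_address("192.0.2.1")     # RFC 5737 documentation range
        v6 = ipaddress.ip_address("2001:db8::1")   # RFC 3849 documentation range
        print(v4.version, int(v4))                 # 4 3221225985 (fits in 32 bits)
        print(v6.version, int(v6))                 # 6, a 128-bit integer

    The sheer size of the IPv6 space is what makes the gradual-transition
    option workable: there is no pressure to ration addresses after the switch.
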
  •  

    Optimizing the capabilities of multicore processors in all sorts of products requires bridging the chasm between the processors' capabilities and the software's ability to exploit them, and industry sources say the long-term focus should be on figuring out how to write code for parallel computing. "We don't even know for sure what we should be teaching, but we know we should be changing what we're teaching," says University of California, Berkeley professor David Patterson, a former president of ACM. UC Berkeley and the University of Illinois at Urbana-Champaign will split $20 million from Intel and Microsoft to underwrite Universal Parallel Computing Research Centers over the next five years, with Berkeley's share going toward the enhancement of research already done by the school's Parallel Computing Laboratory and the hiring of 50 researchers to focus on the problem of writing software for parallelism. Patterson says Berkeley has started introducing freshmen to parallel computing through classes focusing on the "map-reduce" method (a minimal example follows below), while upperclassmen are given a grounding in "sticky" parallelism issues such as load balancing and synchronization. Patterson acknowledges that an entirely new programming language may need to be invented to tackle the challenge of parallel computing. Brown University professor Maurice Herlihy says a more likely possibility is that existing languages will evolve parallel programming features--a view endorsed by AMD's Margaret Lewis, who cites the necessity of interim solutions to amend legacy software written for unicore processors along with software under development. Lewis says AMD is trying to infuse parallel coding methods via compilers and code analyzers, noting that with these interim solutions "programmers aren't getting the full benefits of parallelism ... but it runs better in a multicore environment."
    17 years ago by @gwpl
      acm_technews
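
    The "map-reduce" pattern Patterson's freshmen start with: independent
    "map" tasks run in parallel, and a "reduce" step merges their partial
    results. A minimal word-count sketch in Python (a toy illustration of the
    teaching pattern, not Berkeley's actual course material):

        from collections import Counter
        from functools import reduce
        from multiprocessing import Pool

        def map_chunk(chunk):
            """Map: count words in one chunk, independently of the others."""
            return Counter(chunk.split())

        def merge(a, b):
            """Reduce: combine two partial word counts."""
            return a + b

        if __name__ == "__main__":
            chunks = ["the quick brown fox", "the lazy dog", "the quick dog"]
            with Pool() as pool:
                partials = pool.map(map_chunk, chunks)   # maps run in parallel
            total = reduce(merge, partials, Counter())   # sequential merge
            print(total.most_common(3))  # [('the', 3), ('quick', 2), ('dog', 2)]

    The appeal for teaching is that the map tasks share no state, so the
    load-balancing and synchronization issues mentioned above are pushed
    entirely into the framework.
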
