bookmarks  274

  •  

    An article about Twitter usage, with the finding that 90% of all tweets were produced by just 10% of all Twitter users. There was also some research on gender's influence on Twitter usage (a small illustrative sketch of the concentration figure follows this entry). ·
    5 years and 7 months ago by @astrochicken
     
      research science twitter usage
      (0)
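    Not from the article itself: just a minimal Python sketch of how such a concentration figure (the share of tweets coming from the most active 10% of users) could be computed from per-user tweet counts. The counts below are made up for illustration and do not reproduce the study's numbers.

        # Hypothetical illustration: share of tweets produced by the most active 10% of users.
        # The per-user tweet counts are made-up sample data, not the study's dataset.
        tweet_counts = [250, 120, 90, 40, 12, 9, 7, 4, 3, 1]   # tweets per user

        counts = sorted(tweet_counts, reverse=True)
        top_n = max(1, len(counts) // 10)            # the most active 10% of users
        share = sum(counts[:top_n]) / sum(counts)
        print(f"Top 10% of users produced {share:.0%} of all tweets")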
       
       
    •  

      Free or low-cost sources of unstructured information, such as Internet news and online discussion sites, provide detailed local and near real-time data on disease outbreaks, even in countries that lack traditional public health surveillance. To improve public health surveillance and, ultimately, interventions, we examined 3 primary systems that process event-based outbreak information: Global Public Health Intelligence Network, HealthMap, and EpiSPIDER. Despite similarities among them, these systems are highly complementary because they monitor different data types, rely on varying levels of automation and human analysis, and distribute distinct information. Future development should focus on linking these systems more closely to public health practitioners in the field and establishing collaborative networks for alert verification and dissemination. Such development would further establish event-based monitoring as an invaluable public health resource that provides critical context and an alternative to traditional indicator-based outbreak reporting. ·
      5 years and 7 months ago by @cschie
      (0)
       
       
    •  

      Mark Gibbs ponders how to analyze Twitter for a specific search term using the Twitter search API and Microsoft Excel. Also includes links to the remaining parts 2-4 (a rough sketch of such a search-API query follows this entry). ·
      5 years and 8 months ago by @astrochicken
       
        analysis excel twitter
        (0)
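      The article's own Excel walkthrough is not reproduced here; as a rough, hedged sketch, this is how a search-term query against the Twitter search API of that era (the public, unauthenticated search.twitter.com JSON endpoint, since retired) could look in Python. The URL and field names follow the old API's commonly documented format, not anything taken from the article, and the search term is a placeholder.

          # Hedged sketch: query the old, unauthenticated Twitter search API for a term.
          # The search.twitter.com endpoint has since been retired; this only illustrates the pattern.
          import json
          import urllib.parse
          import urllib.request

          query = "swineflu"   # placeholder search term
          url = "http://search.twitter.com/search.json?q=" + urllib.parse.quote(query)

          with urllib.request.urlopen(url) as resp:
              data = json.load(resp)

          for tweet in data.get("results", []):
              print(tweet["from_user"], tweet["text"])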
         
         
      •  

        Following up on KMeans Clustering Now Running on Elastic MapReduce, Stephen Green has generously documented the steps that were necessary to get an example of k-Means clustering up and running on Amazon’s Elastic MapReduce (EMR) on the Apache Lucene Mahout wiki (a plain k-means illustration follows this entry). ·
        5 years and 8 months ago by @cschie
        (0)
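        The actual Mahout/EMR setup steps live on the wiki and are not repeated here; below is only a small, self-contained Python illustration of the k-means algorithm that such a job distributes (Lloyd's algorithm on made-up 2-D points), not the Mahout commands or the EMR job submission.

            # Plain, single-machine k-means sketch (Lloyd's algorithm) on made-up 2-D points.
            # Illustrates the algorithm Mahout distributes; not the EMR setup itself.
            import random

            def kmeans(points, k, iterations=20):
                centroids = random.sample(points, k)
                for _ in range(iterations):
                    # assign each point to its nearest centroid
                    clusters = [[] for _ in range(k)]
                    for p in points:
                        nearest = min(range(k),
                                      key=lambda i: (p[0] - centroids[i][0]) ** 2 +
                                                    (p[1] - centroids[i][1]) ** 2)
                        clusters[nearest].append(p)
                    # move each centroid to the mean of its cluster
                    for i, cluster in enumerate(clusters):
                        if cluster:   # keep the old centroid if a cluster went empty
                            centroids[i] = (sum(x for x, _ in cluster) / len(cluster),
                                            sum(y for _, y in cluster) / len(cluster))
                return centroids

            points = ([(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(50)] +
                      [(random.gauss(5, 1), random.gauss(5, 1)) for _ in range(50)])
            print(kmeans(points, k=2))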
         
         
      •  

        Uploading data, for example your off-site backup files, to Amazon S3 is easier than you might think. Here are some basic steps with links (a minimal upload sketch follows this entry). ·
        5 years and 8 months ago by @cschie
        (0)
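        The post's exact steps and links are not reproduced here; as a hedged sketch, uploading a backup file with today's boto3 library looks roughly like this. The bucket name, object key and file path are placeholders, and credentials are assumed to come from the usual AWS configuration.

            # Minimal boto3 sketch: upload a local backup file to an S3 bucket.
            # Bucket, key and file path are placeholders; credentials come from the
            # standard AWS configuration (environment variables, ~/.aws/credentials, ...).
            import boto3

            s3 = boto3.client("s3")
            s3.upload_file(
                Filename="backups/offsite-2009-06-01.tar.gz",   # local file (placeholder)
                Bucket="my-offsite-backups",                    # existing bucket (placeholder)
                Key="offsite/2009-06-01.tar.gz",                # object key in the bucket
            )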
         
         
      •  

        Will Scully's Blog on Data, Analytics... ·
        5 years and 8 months ago by @cschie
        (0)
         
         
      •  

        Check out some of the new apps over at Chartbeat. Very cool real-time analytics with alerts. I wonder when Woopra will announce something similar. When will Google Analytics become real-time and provide alerts for its millions of users? Hats off to the chaps over at Chartbeat for beating them to it! Well done. ·
        5 years and 8 months ago by @cschie
        (0)
         
         
      •  

        The book is out: Yahoo! Web Analytics: Tracking, Reporting, and Analyzing for Data-Driven Insights. His philosophy is that you should focus on three different but equally important tasks: A) collecting data, B) reporting on data, and C) deriving insight from data. Depending on one's vantage point, one or more of the chapters will be in focus. He has divided the book into three parts to reflect these broad tasks.

        Part 1, “Advanced Web Analytics Installation,” consists of Chapters 1 through 5. The focus is on data collection. True competitive advantage in web marketing comes from collecting the right data, but also, and no less important, from configuring your web analytics tool in such a way that you can derive insight from the data. Part 1 features detailed code examples that webmasters or developers can apply directly. Marketing people and executives will learn what opportunities they can demand from this tool. He also shows you how to add reporting dimensions to the predefined report structures for fantastic filtering and segmentation opportunities.

        Part 2, “Utilizing an Enterprise Web Analytics Platform,” encompasses Chapters 6 through 10, where he focuses on reports. Creating reports is an easy feat, but remember that reports are never better than the data you collect. You need an exceedingly good understanding of how to work with your data. Part 2 is less technical than the first part. In it he teaches you to use your reporting toolbox to provide targeted answers to specific questions, such as “How much revenue did we make from first-time organic search visitors from Canada last week?” For this and many other questions you’ll encounter there is no standard report, but you will know how to get this answer and hundreds of others when you’re through with this section (a small illustrative sketch of such a segmentation question follows this entry).

        Part 3, “Actionable Insights,” encompasses Chapters 11 through 13 and focuses on how to take action on your data to optimize your web property. Having gone through the effort of implementing the data collection and reporting strategies in Parts 1 and 2, you will have gained enough insight to start an optimization process. Part 3 introduces you to optimization using a set of actionable insights. This is merely an appetizer, and the handful of optimizations he presents are not, by any means, the only ones you can pursue. But the ideas and attitude behind them can most definitely be copied and carry you down other optimization avenues. Think of this section as an idea catalog. One of the most important questions he tackles in this section is paid search optimization. ·
        5 years and 8 months ago by @cschie
        (0)
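        To make the kind of question mentioned in Part 2 concrete ("How much revenue did we make from first-time organic search visitors from Canada last week?"), here is a small, hypothetical pandas sketch over a made-up visit-level table. The column names and data are assumptions for illustration, not the book's examples or any particular tool's export format.

            # Hypothetical segmentation sketch over made-up visit-level data.
            # Column names (visit_date, country, traffic_source, is_first_visit, revenue)
            # are assumptions for illustration, not a real analytics export.
            import pandas as pd

            visits = pd.DataFrame({
                "visit_date":     pd.to_datetime(["2009-06-01", "2009-06-02", "2009-06-03", "2009-06-10"]),
                "country":        ["Canada", "Canada", "US", "Canada"],
                "traffic_source": ["organic search", "paid search", "organic search", "organic search"],
                "is_first_visit": [True, False, True, True],
                "revenue":        [120.0, 80.0, 45.0, 60.0],
            })

            last_week = (visits["visit_date"] >= "2009-06-01") & (visits["visit_date"] <= "2009-06-07")
            segment = visits[last_week
                             & (visits["country"] == "Canada")
                             & (visits["traffic_source"] == "organic search")
                             & visits["is_first_visit"]]
            print("Revenue, first-time organic search visitors from Canada, last week:",
                  segment["revenue"].sum())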
         
         
      •  

        The Management Monitor from the Steinbeis-Transferzentrum für Unternehmensentwicklung (STZUE) in Pforzheim recently received the Innovationspreis IT award from the Initiative Mittelstand in the Business Intelligence (BI) category. An interview with Professor Elke Theobald, the mind behind the innovation. ·
        5 years and 8 months ago by @astrochicken
        (0)
         
         
      •  

        Confusion about Services Based Architectures [SBA, SOA, EDA, ...] has been created by a number of industry players. Industry analysts like Forrester first used the term Services Based Architecture until 2000, when Gartner came up with their own term, Services Oriented Architecture (SOA). Forrester was still using the term SBA in 2002. Gartner next created the term Event Driven Architecture and has now come full circle back to SOA 2.0 (supporting both SOA and EDA, like the original SBA). ·
        5 years and 8 months ago by @cschie
        (0)
         
         
      •  

        I got an update on the Oracle Business Rules product recently. Oracle is an interesting company - they have the components of decision management but do not yet have them under a single umbrella. For instance, they have in-database data mining (blogged about here), the Real Time Decisions (RTD) engine, event processing rules and so on. Anyway, this update was on business rules. ·
        5 years and 8 months ago by @cschie
        (0)
         
         
      •  

        One year ago I penned Event Processing in Twitter Space, and today parts of the net are buzzing about Twitter. In a nutshell, Twitter is a one-to-many communications service that uses short messages (140 chars or less). Following on the heels of the blogging phenomenon, Twitter has been primarily used for microblogging and group communications. Twitter and Twitter-like technologies have great promise in many areas. For example, you could be subscribed to the @tsunamiwarning channel on your dream island vacation and get instant updates on potential disasters. A team of people working in network management could subscribe to the @myserverstatus channel and receive updates on the health of their company's IT services. Passengers could subscribe to the @ourgatestatus channel and follow up-to-date information on their flight. Twitter was created to answer the simple question, “What are you doing now?” (A tiny publish/subscribe sketch of the channel idea follows this entry.) ·
        5 years and 8 months ago by @cschie
         
          CEP event processing twitter
          (0)
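          To make the subscribe-to-a-channel idea concrete, here is a tiny, hypothetical in-memory publish/subscribe sketch in Python. The channel names echo the entry's examples, but the code only illustrates the pattern; it is not Twitter's API.

              # Tiny in-memory publish/subscribe sketch illustrating the channel idea.
              # Channel names echo the entry's examples; this is not the Twitter API.
              from collections import defaultdict

              subscribers = defaultdict(list)   # channel name -> list of callback functions

              def subscribe(channel, callback):
                  subscribers[channel].append(callback)

              def publish(channel, message):
                  for callback in subscribers[channel]:
                      callback(message)

              subscribe("@myserverstatus", lambda msg: print("ops team sees:", msg))
              subscribe("@ourgatestatus",  lambda msg: print("passenger sees:", msg))

              publish("@myserverstatus", "web tier CPU above 90% for 5 minutes")
              publish("@ourgatestatus",  "flight 123 now boarding at gate B7")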
           
           
        •  

          Operational business intelligence (BI) has a focus on day-to-day operations and so requires low-latency or real-time data to be integrated with historical data. It also requires BI systems that are integrated with operational business processes. However, while operational BI might be part and parcel of operational processes and systems, the focus is still on changing how people make decisions in an operational context. To compete on decisions, however, you must recognize that your customers react to the choices made by you, your staff and your systems, and that you must manage all the decisions you (or your systems) make – even the very small ones. This is the basis for enterprise decision management or EDM. Five main areas of difference exist between operational BI and EDM – a focus on decisions (especially operational ones), organizational integration, analytic technology change, adoption of additional technology and adaptive control. In this article, I want to outline some steps organizations can take as they move from “traditional” BI towards operational BI and enterprise decision management. Some of these steps would be a good idea if operational BI was your goal. But hopefully you are more ambitious than that and want to really begin to compete on decisions. ·
          5 years and 8 months ago by @cschie
          (0)
           
           
        •  

          Enterprise architecture is a management practice that was initially developed within the IT discipline to manage the complexity of IT systems, as well as the ongoing change constantly triggered by business and technology developments. Today, one of the primary reasons EA is adopted in organizations worldwide is to promote alignment between business requirements and IT solutions. EA is expanding into other business disciplines, as well: to enable business strategy development, improve business efficiency, facilitate knowledge management and assist with organizational learning, to name a few. In order to effectively implement EA in organizations, architects are increasingly looking for best practices and frameworks to assist them. One of the few architecture frameworks publicly available to guide architects in their implementation is TOGAF. Put simply, TOGAF is a comprehensive toolset for assisting in the acceptance, production, use and maintenance of enterprise architectures. It is based on an iterative process model supported by best practices and a reusable set of existing architectural assets. Since it was developed by members of The Open Group Architecture Forum more than 10 years ago, TOGAF has emerged as arguably the de facto standard framework for delivering enterprise architecture. ·
          5 years and 8 months ago by @cschie
          (0)
           
           
        •  

          5 years and 8 months ago by @cschie
          (0)
           
           
        •  

          In the book The Art of War for Executives, Donald G. Krause offers the following interpretation: “Sun Tsu notes, superior commanders succeed in situations where ordinary people fail because they obtain more timely information and use it more quickly.” For metadata professionals, this observation is increasingly relevant as more and more of the business seeks integration and federation, alignment with business goals and strategies, and agility - the ability to respond both quickly and accurately to change. Industry analysts and IT professionals are focusing less on individual solutions in which metadata management plays a role, and more on metadata management as an overall strategy, for the benefits it provides to multiple aspects of the whole organization. ·
          5 years and 8 months ago by @cschie
          (0)
           
           
        •  

          What they’ve learned is that for every new ETL script, there are probably 20 other systems that have custom-developed their own data retrieval code and never documented it. ·
          5 years and 8 months ago by @cschie
           
            BI ETL SOA
            (0)
             
             
          •  

            In 2009, Web analytics managers have a multitude of different tools to choose from when deciding what to deploy at their corporation. Tools from industry leaders such as Omniture, WebTrends, Unica, CoreMetrics, Google, and Yahoo are among the most popular, while options from smaller players like ClickTracks and Woopra exist as well. In theory, you deploy a tool, customize it to fit your needs, and start analyzing the reports — and it all goes swimmingly, right? Then why have many corporations already chewed through two, maybe even three tools over the last several years, or deployed multiple tools, in an attempt to arrive at where they need to be — delivering comprehensive and systematic analysis to their business community, helping to drive action from insight, and taking the mantra of “competing on analytics” and “data-driven culture” to the next level? Several factors cause a disconnect between the promise of a tool and its successful use, and ultimately cause a tool to fail: ·
            5 years and 8 months ago by @cschie
             
              2.0 adoption analytics web
              (0)
               
               

            publications  116