I'm interested in the relative merits of ATS and Epigram. Epigram is a Pure Type System that seeks to unify types and terms, whereas ATS distinguishes the statics and the dynamics of the language. What benefits and limitations do these two approaches have, in terms of complexity for both the developer and the implementor?
We present Logically Qualified Data Types, abbreviated to Liquid Types, a system that combines Hindley-Milner type inference with Predicate Abstraction to automatically infer dependent types precise enough to prove a variety of safety properties. Liquid types allow programmers to reap many of the benefits of dependent types, namely static verification of critical properties and the elimination of expensive run-time checks, without manual annotation. We have implemented liquid type inference in Dsolve, which takes as input an OCaml program and a set of logical qualifiers and infers dependent types for the expressions in the OCaml program. We describe experiments using Dsolve to statically verify the safety of array accesses on a set of OCaml benchmarks that were previously annotated as part of the DML project. When used with a simple set of bounds-checking qualifiers, Dsolve reduces the manual annotation required from 31% of program text to under 1%.
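To give a concrete feel for the kind of program Dsolve handles, here is a toy OCaml function in the style of those benchmarks; the identifiers and the qualifier spellings below are mine, not taken from the Dsolve distribution.

    (* Hypothetical benchmark-style function; all names are invented. *)
    let arraymax (a : int array) : int =
      let rec loop i m =
        if i < Array.length a then
          (* Given qualifiers of the shape 0 <= v and v < Array.length a,
             liquid type inference can assign i the refined type
             {v : int | 0 <= v && v < Array.length a} at this point,
             proving the access a.(i) safe without a run-time check. *)
          loop (i + 1) (max m a.(i))
        else m
      in
      loop 0 0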
As part of a larger project, we have built a declarative assembly language. This language enables us to specify multiple code paths to compute particular quantities, giving the instruction scheduler more flexibility in balancing execution resources for superscalar execution. The instruction scheduler is also innovative in that it includes aggressive pipelining and an exhaustive (but lazy) search for optimal instruction schedules. We present some examples where our approach has produced very promising results. I think this paper is a nice follow-up to the recent discussion of SPE, because it goes further than that paper by analyzing what data are necessary to achieve the ultimate goal of optimal or near-optimal instruction scheduling on superscalar architectures. In other words, it strongly suggests that we can do better than simply embedding low-level instructions in a high-level language: instead, we can embed a graph of desired results (vertices) and instructions for reaching them.
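Purely to illustrate that idea, here is a hypothetical OCaml sketch of such an embedding; it is not the paper's actual representation, and all names are invented. Vertices are desired values, and each value carries alternative instruction sequences among which the scheduler can choose.

    (* Invented sketch: a desired value maps to alternative code paths. *)
    type value = string
    type instr = { op : string; args : value list; result : value }

    (* Two equivalent ways to compute "t": the scheduler picks whichever
       better balances the execution units of the target superscalar machine. *)
    let paths_for_t : instr list list = [
      [ { op = "add"; args = ["x"; "x"]; result = "t" } ];  (* t = x + x  *)
      [ { op = "shl"; args = ["x"; "1"]; result = "t" } ];  (* t = x << 1 *)
    ]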
The abstract: We propose a novel, comonadic approach to dataflow (stream-based) computation. This is based on the observation that both general and causal stream functions can be characterized as coKleisli arrows of comonads and on the intuition that comonads in general must be a good means to structure context-dependent computation. In particular, we develop a generic comonadic interpreter of languages for context-dependent computation and instantiate it for stream-based computation. We also discuss distributive laws of a comonad over a monad as a means to structure combinations of effectful and context-dependent computation. If you've ever wondered about dataflow or comonads, this paper is a good read. It begins with short reviews of monads, arrows, and comonads, and includes an implementation. One feature that stood out is the idea of a higher-order dataflow language.
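For a rough feel of the construction, here is a minimal OCaml sketch, with my own names rather than the paper's Haskell code: a comonad signature, an instance for causal streams represented as the current value plus its history, and one coKleisli arrow over it.

    (* Minimal comonad signature: extract reads the value at the current
       context; extend runs a context-dependent function at every context. *)
    module type COMONAD = sig
      type 'a t
      val extract : 'a t -> 'a
      val extend  : ('a t -> 'b) -> 'a t -> 'b t
    end

    (* Causal streams as nonempty histories: (current value, past values,
       most recent first). A coKleisli arrow 'a t -> 'b computes an output
       from the current input and everything seen before it. *)
    module History : COMONAD with type 'a t = 'a * 'a list = struct
      type 'a t = 'a * 'a list
      let extract (now, _) = now
      let extend f (now, past) =
        (* Rebuild every prefix of the history and run f on each. *)
        let rec go now past =
          let here = f (now, past) in
          match past with
          | [] -> (here, [])
          | p :: ps -> let (h, hs) = go p ps in (here, h :: hs)
        in
        go now past
    end

    (* Example coKleisli arrow: the running average of the whole history. *)
    let avg ((now, past) : float * float list) : float =
      List.fold_left (+.) now past /. float_of_int (1 + List.length past)

Composing such arrows via extend is coKleisli composition, which is how a comonadic interpreter threads the context through a causal stream program.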
Does anyone know of work done on co-data-types? There has been a lot of work done on co-data and stream programming, but the assumption is usually of a very simple, repetitive data stream. Has anyone done any work on highly structured co-data or co-data-types? A data structure passed into a function represents a limit on all of its constituent types; that is, the structure must be fully constructed (AND logic) before the function can be called. But a co-data structure (co-type) passed into a function represents a co-limit and only needs to be partially constructed (OR logic); perhaps the function can even return partial (co-data) results based upon the co-data-type it receives. This seems to me a very relevant topic for advancing stream programming and the exploitation of multiple layers of data parallelism.
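A small OCaml illustration of the AND/OR contrast described above (my own framing of the question, not an answer to it): the record must be fully built before a function can consume it, while the lazily built stream can yield results after only a finite prefix has been forced.

    (* Data: a record is a limit; both fields must exist before use (AND). *)
    type point = { x : int; y : int }
    let norm2 p = (p.x * p.x) + (p.y * p.y)

    (* Co-data: a stream whose tail is produced on demand (OR); a consumer
       can return a result after forcing only a finite prefix. *)
    type 'a stream = Cons of 'a * 'a stream Lazy.t

    let rec nats n = Cons (n, lazy (nats (n + 1)))

    let rec take n (Cons (x, xs)) =
      if n = 0 then [] else x :: take (n - 1) (Lazy.force xs)

    let first_five = take 5 (nats 0)   (* [0; 1; 2; 3; 4] *)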
Chuck Thacker is building a new research computer called the BEE3. There was a time, years ago, when computer architecture was a most exciting area to explore. Talented, young computer scientists labored on the digital frontier to devise the optimal design, structure, and implementation of computer systems. The crux of that work led directly to the PC revolution from which hundreds of millions benefit today. Computer architecture was sexy. These days? Not so much. But Chuck Thacker aims to change that.