We were negotiating with the Pentagon when we had a blue screen of death. That was the last straw. When you're holding the moon for ransom, you value stability in an application.

*Steve*

# Intro

Steve was speaking of operating systems, but his words apply to any clockwork. When lives are on the line, lives you want to destroy, will your laser deathray malfunction? This guide provides tools and techniques for making digital superweapons more robust, more efficient, more deadly.

# Declarative

This guide uses the declarative Haskell programming language to illustrate concepts that increase software stability and speed. Haskell is free; open source and community driven; available for major operating systems including Windows, Mac OS X, and Linux; and pure, functional, lazy, type-safe, algebraic, inferential, and parallel. These last attributes will be explained later. For now, focus on installing Haskell Platform.

Haskell Platform

http://hackage.haskell.org/platform/

When Haskell Platform finishes, open a terminal (e.g. Command Prompt, Terminal.app, xterm) and enter `ghci`. Let's begin with the number of the beast.

```
$ ghci
Prelude> 666
666
Prelude> 666 + 111
777
Prelude> let x = 111 in 666 + x
777
```

The `let` expression clarifies computation by creating local nicknames for more complicated expressions.

```
Prelude> let x = 1 * 100 + 1 * 10 + 1 * 1 in 666 + x
777
```

Imperative programming languages tend to perform computations by rote memory assignment, which is why C-based languages are often so verbose: their world view is that of machine code. In contrast to these processor instructions, Haskell, as a declarative language, manipulates reducible expressions, eventually reducing them to values. Because Haskell code is specified as expressions rather than instructions, the control flow is not as linear or intuitive as it would be in an imperative language. By letting go of preconceived notions of computation, Haskell is able to perform almost unheard of optimizations such as memoization and parallelization. And so a simple `let` can be quite devious.

```
Prelude> let 2 + 2 = 5 in 2 + 2
5
```

Or substitute your doublethink here. Indeed, 2 + 2 ≠ 5. Haskell is not asserting that they are equal; rather, Haskell is temporarily redefining `(+)` for the case `(+) 2 2` within the `let` expression. Haskell primarily uses prefix notation, but provides syntactic sugar for infix notation for common operators. By evaluating the expression `2 + 2` outside the `let`, the proper answer is revealed.

```
Prelude> 2 + 2
4
```
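Prefix and infix are just two spellings of the same function application. A brief sketch using only standard Prelude functions:

```haskell
#!/usr/bin/env runhaskell

-- Every infix operator is also a prefix function, and vice versa.
main :: IO ()
main = do
    print ((+) 2 2)     -- prefix form of 2 + 2
    print (2 + 2)       -- infix form, syntactic sugar for the above
    print (div 10 3)    -- prefix call of a named function
    print (10 `div` 3)  -- backticks turn a named function into an infix operator
```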

# Pure

So, Lone Star, now you see that evil will always triumph because good is dumb.

*Dark Helmet*

If GHCi is the devil's playground, GHC is the devil's workshop. GHC is The Glorious Glasgow Haskell Compilation System. Like GCC, GHC converts source code into executable machine code. Unlike C, however, Haskell lends itself to powerful optimizations, enabling advanced techniques such as infinite data structures and memoization.

Code examples will be saved in `.hs` plain text files with spaces for indentation and run in the terminal using `runhaskell`. Shebangs (`#!...`) are optional; they allow Unix-based systems like Mac OS X and Linux to run programs in the short form `./goodbye.hs`.

goodbye.hs

```haskell
#!/usr/bin/env runhaskell

greeting :: String
greeting = "Goodbye " ++ "World!"

main :: IO ()
main = do
    putStrLn greeting
```

Example

```
$ runhaskell goodbye.hs
Goodbye World!
```

The program `goodbye.hs` performs two computations. The outer computation `main` is a *nonpure* computation, printing a string to the terminal. The inner computation `greeting` is a *pure* computation, concatenating two strings together. The difference is that the pure computation merely transforms data, while the nonpure computation has the side effect of writing to the stdout I/O stream. Haskell isolates pure code from nonpure code for clarity and utility: pure code can be optimized in a variety of ingenious ways.
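To see the separation in miniature, here is a sketch (the `plot` name is ours): the pure function can be evaluated and tested in isolation, while `main` quarantines the side effects.

```haskell
#!/usr/bin/env runhaskell

-- Pure: merely transforms data; the same input always yields the same output.
plot :: String -> String
plot scheme = "Initiating " ++ scheme ++ "..."

-- Nonpure: has the side effect of writing to stdout.
main :: IO ()
main = putStrLn (plot "Operation Moonbeam")
```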

# Lazy

Do you seriously think I would explain my master stroke to you if there were even the slightest possibility you could affect the outcome?

*Ozymandias*

By default, Haskell uses lazy evaluation; it performs computations only when it needs to. This may seem like wasteful procrastination, but it's actually a deliberate effort towards optimization.
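A small sketch of this procrastination in action: the binding below would explode if evaluated, but its value is never demanded, and only three elements of an infinite list are ever computed.

```haskell
#!/usr/bin/env runhaskell

main :: IO ()
main = do
    let doomsday = error "KABOOM" :: Int  -- defined, but never forced
    let squares = [x * x | x <- [1 ..]]   -- an infinite list
    print (take 3 squares)                -- only 3 elements are ever computed
```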

fib1.hs

```haskell
#!/usr/bin/env runhaskell

fib :: Int -> Int
fib 0 = 0
fib 1 = 1
fib n = fib (n - 1) + fib (n - 2)

main :: IO ()
main = do
    putStrLn (show (fib 30))
```

Example

```
$ time ./fib1.hs
832040
12.11 real  1.92 user  0.21 sys
```

Although calculating `fib 30` is rather intensive, this program can be made to run quickly by memoizing the results of previous calls. If Haskell knows that `fib 28` is 317811, it doesn't need to recalculate this when calling `fib 29`.

fib2.hs

```haskell
#!/usr/bin/env runhaskell

import Data.MemoTrie (memo)

fib :: Int -> Int
fib = memo fib'
    where
        fib' :: Int -> Int
        fib' 0 = 0
        fib' 1 = 1
        fib' n = fib (n - 1) + fib (n - 2)  -- recurse through the memoized fib

main :: IO ()
main = do
    putStrLn (show (fib 30))
```

Example

```
$ time ./fib2.hs
832040
2.56 real  1.94 user  0.06 sys
```

This time, the function calls are memoized using `Data.MemoTrie` from the `MemoTrie` package on Hackage (`cabal install MemoTrie`). The ordinary, unmemoized function has been renamed `fib'`, pronounced "fib prime", and it is exposed to the user as `fib`, a memoized version. It is not surprising that memoization speeds the function up: any programming language can implement a memoized version of a function. What is surprising is that in Haskell this is accomplished simply by wrapping the function as `memo fib'`; the library leans on Haskell's expressive yet robust type system to build a lazy lookup structure keyed by the function's argument type, so each result is computed at most once. Once memoized, Haskell does not actually call the function several hundred times, because it already knows the answers.
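The same effect can be hand-rolled with nothing but a lazy list, no library required. A sketch (the `fibs` name is ours):

```haskell
#!/usr/bin/env runhaskell

-- Memoization via a lazy list: each index is computed at most once,
-- because Haskell caches the cells of fibs as they are evaluated.
fibs :: [Integer]
fibs = map fib [0 ..]
    where
        fib 0 = 0
        fib 1 = 1
        fib n = fibs !! (n - 1) + fibs !! (n - 2)  -- recurse through the cache

main :: IO ()
main = print (fibs !! 30)
```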

Another use of lazy evaluation is infinite data structures.

# Infinite

Gee, Brain, what do you want to do tonight?

The same thing we do every night, Pinky: Try to take over the world!

*Pinky and the Brain*

The RSA algorithm, built on prime numbers, secures online transactions the world over. To conquer the primes would be to conquer RSA, and then, the world.

sieve.hs

```haskell
#!/usr/bin/env runhaskell

sieve :: (Integral a) => [a] -> [a]
sieve (n:ns) = n : sieve ns'
    where ns' = filter ((/= 0) . flip rem n) (n:ns)

primes :: (Integral a) => [a]
primes = sieve [2..]

main :: IO ()
main = do
    putStrLn (show (take 10 primes))
```

Example

```
$ runhaskell sieve.hs
[2,3,5,7,11,13,17,19,23,29]
```

The `primes` list enumerates the infinite list of prime numbers by trial division, in the style of the Sieve of Eratosthenes. Haskell doesn't need to know all the prime numbers, just the primes specifically requested. In this case, the program takes the first 10 primes and prints them. Requesting all the primes results in Haskell returning just that.

```
$ ghci
Prelude> :load sieve.hs
[1 of 1] Compiling Main             ( sieve.hs, interpreted )
Ok, modules loaded: Main.
*Main> primes
[2,3,5,7,11,13,17,19,23,29,31,37,41,43,47,53,59,61,67,71,73,79,83,89,97,101,103,
107,109,113,127,131,137,139,149,151,157,163,167,173,179,181,191,193,197,199,211,
223,227,229,233,239,241,251,257,263,269,271,277,281,283,293,307,311,313,317,331,
337,347,349,353,359,367,373,379,383,389,397,401,409,419,421,431,433,439,443,449,
457,461,463,467,479,487,491,499,503,509,521,523,541,547,557,563,569,571,577,587,
593,599,601,607,613,617,619,631,641,643,647,653,659,661,673,677,683,691,701,709,
719,727,733,739,743,751,757,761,769,773,787,797,809,811,821,823,827,829,839,853,
857,859,863,877,881,883,887,907,911,919,929,937,941,947,953,967,971,977,983,991,
...
```

This is not useful. While Haskell encourages infinite data structures, one must be careful to only use finite pieces of the structures. And one must use the right piece; Haskell cannot calculate the last prime, because there is no `last` prime; the list goes on forever.

```
*Main> last [1, 2, 3]
3
*Main> last primes
...
```
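Safe slices of an infinite list include `take`, `takeWhile`, and indexing, all of which demand only finitely many elements. A sketch, repeating the `primes` definition from sieve.hs:

```haskell
#!/usr/bin/env runhaskell

sieve :: (Integral a) => [a] -> [a]
sieve (n:ns) = n : sieve (filter ((/= 0) . flip rem n) ns)

primes :: (Integral a) => [a]
primes = sieve [2 ..]

main :: IO ()
main = do
    print (takeWhile (< 30) primes)  -- every prime below 30: finite, safe
    print (primes !! 99)             -- the 100th prime: finite, safe
```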

# Type-Safe

When life gives you lemons, you clone those lemons and make SUPER lemons.

*Scudworth*

Haskell is statically typed: every expression has a type, determined at compile time. The variable `x` cannot be an integer one minute and a string the next. This is known as type-safety. As with lazy evaluation, it's a blessing in disguise: type-safety increases robustness without sacrificing flexibility. Haskell accomplishes this with algebraic data types, pattern matching, type inference, and type classes.

The example programs so far explicitly state the types of the expressions involved. In the Goodbye World program, `greeting` is a `String` formed by concatenating two other `String`s, and `main` is an `IO` action returning `()`, the empty tuple. If `main` were expected to return the greeting after printing it, `main` would have the type `IO String`.

goodbye.hs (explicit types)

```haskell
#!/usr/bin/env runhaskell

greeting :: String
greeting = "Goodbye " ++ "World!"

main :: IO ()
main = do
    putStrLn greeting
```

Haskell can intelligently determine many types automatically. This is known as type inference. If the type signatures are omitted, the program will still compile and run as before.

goodbye.hs (no type signatures)

```haskell
#!/usr/bin/env runhaskell

greeting = "Goodbye " ++ "World!"

main = do
    putStrLn greeting
```

Example

```
$ runhaskell goodbye.hs
Goodbye World!
```
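Inference does more than save keystrokes: with no annotation at all, GHC assigns the most general type it can. A sketch (the `double` name is ours), where one unannotated definition works at several numeric types:

```haskell
#!/usr/bin/env runhaskell

-- No signature: GHC infers double :: Num a => a -> a.
double x = x + x

main :: IO ()
main = do
    print (double (21 :: Int))      -- works at Int
    print (double (1.5 :: Double))  -- and at Double
```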

Pattern matching works well with type-safety, as used in the Fibonacci program. The `fib` function is defined by three patterns. The first two patterns, `fib 0` and `fib 1`, are base cases. The last pattern, `fib n`, is the general recursive case; it is used whenever `fib` is applied to a value that is not `0` or `1`. Because `n` decreases with each recursive call, the recursion always bottoms out at one of the two base case patterns, eventually returning a single value. (A negative argument, however, would never reach a base case; guarding against such edge cases, or proving their absence rigorously, is the domain of Coq, a proof-assistant cousin to Haskell and the MLs.)

fib.hs (minimal)

```haskell
#!/usr/bin/env runhaskell

fib :: Int -> Int
fib 0 = 0
fib 1 = 1
fib n = fib (n - 1) + fib (n - 2)

main :: IO ()
main = do
    putStrLn (show (fib 30))
```

It would be redundant to add new patterns, but they could aid in memoization.

fib.hs (extended)

```haskell
#!/usr/bin/env runhaskell

fib :: Int -> Int
fib 0 = 0
fib 1 = 1
fib 2 = 1
fib 3 = 2
fib 4 = 3
fib 5 = 5
fib 6 = 8
...
fib 30 = 832040
fib n = fib (n - 1) + fib (n - 2)

main :: IO ()
main = do
    putStrLn (show (fib 30))
```

# Records

Target the Reavers. Target the Reavers! Target everyone! Somebody fire!

*Operative*

The closest thing Haskell has to objects is records. Records encapsulate data in a form more manipulable than objects, but records do not include methods. Data transformations naturally flow from the data types; they are already semantically bundled, and there is no need to labor the point with `object.method()` syntax when `function record` will do. This is how Haskell gets by just fine without object-oriented programming: the type system informs the programmer how the data can be treated, and informs the compiler, at a low level, how the data must be treated.

laser.hs

```haskell
#!/usr/bin/env runhaskell

data Target
    = Fool String
    | Crowd Integer
    | TheMoon
    deriving (Eq, Ord, Show)

fireOn :: Target -> String
fireOn (Fool s) = s ++ " blasted!"
fireOn (Crowd n) = show n ++ "'s murdered!"
fireOn TheMoon = "How did you miss a shot like that?"
```

Example

```
$ ghci
Prelude> :load laser.hs
[1 of 1] Compiling Main             ( laser.hs, interpreted )
Ok, modules loaded: Main.
*Main> fireOn (Fool "Jerry")
"Jerry blasted!"
*Main> fireOn (Crowd 100)
"100's murdered!"
*Main> fireOn TheMoon
"How did you miss a shot like that?"
```

The `laser` program declares a data type `Target` with three constructors. A target is either a `Fool String`, a `Crowd Integer`, or `TheMoon`. Pattern matching in the `fireOn` function is used to unpack the target and determine which of the three constructors was used. Then `fireOn` acts on the specific constructor, blasting fools, murdering crowds, and somehow missing the moon.

No matter which constructor was used, Haskell derives how to compare two targets for equivalence, how to order targets in a list, and how to transform targets into strings for printing.

```
*Main> :m + Data.List
*Main Data.List> sort [Crowd 50, Crowd 40, TheMoon, Crowd 60, Fool "Ava"]
[Fool "Ava",Crowd 40,Crowd 50,Crowd 60,TheMoon]
```

Fools come before Crowds come before TheMoon in laser striking priority: the derived `Ord` instance orders values by the order in which their constructors are declared. In other languages, one would have to write custom `compareTo` and `toString` functions; in Haskell, `deriving (Eq, Ord, Show)` automatically generates this basic functionality for any given data type.
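The `Target` type above is a plain algebraic data type; for data with named fields, Haskell also offers record syntax, which generates accessor functions for free. A sketch (the `Lair` type is ours):

```haskell
#!/usr/bin/env runhaskell

-- Record syntax: each field name doubles as an accessor function.
data Lair = Lair
    { location :: String
    , hasLaser :: Bool
    , minions  :: Int
    } deriving (Eq, Show)

hq :: Lair
hq = Lair { location = "Skull Island", hasLaser = True, minions = 500 }

main :: IO ()
main = do
    putStrLn (location hq)                -- accessor generated from the field name
    print (minions (hq { minions = 0 }))  -- record update returns a modified copy
```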

Functions can be composed together and used as arguments for other functions.

```
*Main Data.List> let shout = unwords . take 2 . words . fireOn
*Main Data.List> shout TheMoon
"How did"
```

# Robust

Haskell refuses to `fireOn` anything but Fools, Crowds, and TheMoon, because the function is only defined for these. Haskell will not compile code that attempts to `fireOn` Henchmen or MeeMaw, because these are not valid targets. Furthermore, for any valid target, Haskell will always faithfully `fireOn` the target, because Haskell has no `null`. Every expression is of a certain type and can never change type. As seen in the laser program, this by no means limits Haskell's expressiveness, nor does it encumber the programmer. Type-safety increases clarity, robustness, and flexibility, affording the evil genius more time for plotting the demise of his foes.
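Where other languages would return `null`, Haskell makes possible absence explicit in the type with `Maybe`, and the compiler forces callers to handle both cases. A sketch (the helper names are ours):

```haskell
#!/usr/bin/env runhaskell

import Data.List (find)

-- find returns a Maybe: Just a match, or Nothing; never null.
firstBigCrowd :: [Integer] -> Maybe Integer
firstBigCrowd = find (> 100)

-- Pattern matching must cover both constructors, or GHC complains.
describe :: Maybe Integer -> String
describe (Just n) = "Target acquired: crowd of " ++ show n
describe Nothing  = "No crowd worth the laser."

main :: IO ()
main = do
    putStrLn (describe (firstBigCrowd [50, 200, 99]))  -- finds 200
    putStrLn (describe (firstBigCrowd [1, 2, 3]))      -- finds nothing
```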

# Resources

This is only the beginning, Haskell has much darkness to explore.

Parallel Processing with Haskell

http://www.yellosoft.us/parallel-processing-with-haskell

Learn You a Haskell for Great Good!

http://learnyouahaskell.com/

Real World Haskell

http://book.realworldhaskell.org/read/