Parallel Programming

Lagom Framework


alternatively, use g8 template to start new project

cd ~/src/scala/lagom/lagom-scala-1.4
sbt -Dsbt.version=0.13.15 new
# start services
sbt runAll

CURL examples

curl http://localhost:9000/api/hello/Alice
curl -H "Content-Type: application/json" -X POST -d '{"message":"Hi"}' http://localhost:9000/api/hello/Alice


create new project
sbt new sbt/scala-seed.g8

sbt can search your local Maven repository if you add it as a repository:

resolvers += "Local Maven Repository" at "file://"+Path.userHome.absolutePath+"/.m2/repository"

or, for convenience:

resolvers += Resolver.mavenLocal

Dependency Management



curl -L -o coursier && chmod +x coursier && ./coursier --help



mkdir -p ~/.ammonite && curl -L -o ~/.ammonite/
sudo curl -L -o /usr/local/bin/amm && sudo chmod +x /usr/local/bin/amm
  • create script, e.g. ~/Downloads/
  • execute script cd ~/Downloads && amm
// print banner
println("Hello World!!")

// common imports
import sys.process._
import collection.mutable

// common initialization code
val x = 123
println("x is " + x)


  "language" : "scala211",
  "display_name" : "Scala 2.11",
  "argv" : [
  • change file permissions
chmod -R 777 $HOME/.local/share/jupyter
  • run ./jupyter-scala inside the directory



Mark Lewis

Author of the Introduction to Scala series of books for CS1, CS2 and Data Structures:

  • Introduction to the Art of Programming Using Scala (2013)
  • Introduction to Programming and Problem Solving Using Scala (2017)
  • Object-Orientation, Abstraction, and Data Structures Using Scala (2017)




EC2 Spot Request

AWS profiles defined at ~/.aws/config. Additional information at aws cli: config-vars and aws cli: request-spot-instances

aws ec2 request-spot-instances --cli-input-json file:///home/xps13/src/scala/observatory/ec2-config.json --profile admin

Contents of ec2-config.json

{
  "SpotPrice": "2.064",
  "LaunchSpecification": {
    "ImageId": "ami-af0fc0c0",
    "InstanceType": "c3.8xlarge",
    "KeyName": "ami-ca46b6a5-rstudio",
    "SecurityGroupIds": [ ]
  },
  "InstanceCount": 1,
  "BlockDurationMinutes": 120
}
  • edit security group, add SSH (port 22) from My IP or Anywhere
  • navigate to folder containing keypair: $ cd ~/Dropbox/Logins/Amazon/EC2
  • remove connection from /home/xps13/.ssh/known_hosts
  • connect to instance
ssh -i "ami-ca46b6a5-rstudio.pem"
sudo yum update -y                                           # get latest updates
sudo yum install -y docker                                   # install docker (Amazon Linux)
sudo service docker start                                    # start the Docker service
sudo docker pull hseeberger/scala-sbt                        # install docker image
sudo docker run -it --rm hseeberger/scala-sbt                # run image
git clone # clone repo
gpg --passphrase "" observatory/                   # decrypt
source observatory/                                    # set S3 credentials
mkdir observatory/src/main/resources                         # create data folder
wget     # download data
unzip -d observatory/src/main/       # extract data
cd observatory && sbt "run-main observatory.ec2 1975-2015 epfl-observatory stations.csv .csv target/temperatures"

Performance of c3.4xlarge instance running Amazon Linux

CPU Utilization (max)

Year  Zoom  Time (sec)  Tiles  Time per tile (sec)
1975  0     22          4      5.5
1975  1     60          9      6.6
1975  2     180         25     7.2
1975  3     600         81     7.41

Item                         Value
Tiles per year               4+9+25+81 = 119
Years                        2016-1975 = 41
Total number of tiles        119*41 = 4879
Average time per tile (sec)  7.2
Est. total time (sec)        4879*7.2 = 35128.8
Est. total time (hrs)        9.8

It is possible to process approx. 10 years in 3 hours.
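The totals in the table can be reproduced in a few lines of Scala, using the table's tile counts and its 7.2 s average per tile:

```scala
object TileEstimate extends App {
  val tilesPerYear = 4 + 9 + 25 + 81   // tile counts per zoom level, from the table
  val years        = 2016 - 1975
  val totalTiles   = tilesPerYear * years
  val secPerTile   = 7.2               // average time per tile from the table
  val totalHours   = totalTiles * secPerTile / 3600

  println(tilesPerYear)                        // 119
  println(totalTiles)                          // 4879
  println(math.round(totalHours * 10) / 10.0)  // 9.8
}
```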


sbt assembly: Standalone jar with all dependencies

If you want to build a standalone executable jar with all dependencies included, use the sbt-assembly plugin (e.g. add it to ~/.sbt/0.13/plugins/plugins.sbt) and then build with

sbt assembly

The standalone jar will be in target/project_name-assembly-x.y.jar.

You can run it by

java -jar project_name-assembly-x.y.jar [class.with.main.function]

sbt package

create /target/scala-2.11/capstone-observatory_2.11-0.1-SNAPSHOT.jar for distribution

sbt package

Include/exclude files in the resource directory

When sbt traverses unmanagedResourceDirectories for resources, it only includes directories and files that match includeFilter and do not match excludeFilter. includeFilter and excludeFilter have type FileFilter, and sbt provides some useful combinators for constructing a FileFilter. For example, in addition to the default hidden-files exclusion, the following also ignores files containing impl in their name:

excludeFilter in unmanagedSources := HiddenFileFilter || "*impl*"

To have different filters for main and test libraries, configure Compile and Test separately:

includeFilter in (Compile, unmanagedSources) := "*.txt"
includeFilter in (Test, unmanagedSources) := "*.html"

Note: By default, sbt includes all files that are not hidden.

jar Commands

check contents of .jar
jar tvf ./target/scala-2.11/capstone-observatory_2.11-0.1-SNAPSHOT.jar

Amazon Web Services


How to use

libraryDependencies += "com.github.seratch" %% "awscala" % "0.6.+"

Configure credentials in the AWS Java SDK way: credentials

add to ~/.profile:

export AWS_ACCESS_KEY_ID="..."

check value of environment variables using sys.env("AWS_ACCESS_KEY_ID")

S3 example

import awscala._, s3._

implicit val s3 = S3.at(Region.Tokyo)  // choose your region

val buckets: Seq[Bucket] = s3.buckets
// val bucket: Bucket = s3.createBucket("unique-name-xxx")
val bucket = buckets(0)
val summaries: Seq[S3ObjectSummary] = bucket.objectSummaries

// bucket.put("sample.txt", new"sample.txt"))
bucket.put("0-0.png", new"target/temperatures/1976/0/0-0.png"))

// val s3obj: Option[S3Object] = bucket.getObject("sample.txt")
val s3obj: Option[S3Object] = bucket.getObject("index.html")
val s3url: String = bucket.getObject("index.html").get.publicUrl.toURI.toString

s3obj.foreach { obj =>
  val url = obj.publicUrl.toURI.toString //
  // obj.generatePresignedUrl( // ?Expires=....
  // bucket.delete(obj) // or obj.destroy()
}

Problem Solving


This is the official repository for the CafeSat source code. CafeSat is a SAT/SMT solver written entirely in Scala. CafeSat aims to provide an efficient command-line tool for solving SMT problems, as well as a library for Scala programs that need the capabilities of SAT/SMT solvers.


Daily intervals, however, are often no longer sufficient. Speed is in demand: analyses and evaluations are expected promptly, not minutes or even hours later. This is where stream processing comes into play: data is processed as soon as it becomes known to the system. This began with the Lambda architecture (cf. [1]), in which stream and batch processing run in parallel because stream processing could not guarantee consistent results. With today's systems it is also possible to achieve consistent results in near real time with streaming processing alone. (cf. [2])



  • Asynchronous Programming for Scala and Scala.js

Monix is a high-performance Scala / Scala.js library for composing asynchronous and event-based programs. It exposes high-level types such as observable sequences (asynchronous streams that expand on the observer pattern), is strongly inspired by ReactiveX and Scalaz, is designed from the ground up for back-pressure, interacts cleanly with Scala's standard library, and is compatible out of the box with the Reactive Streams protocol.

FS2: Functional Streams for Scala (previously ‘Scalaz-Stream’)

FS2 is a streaming I/O library. The design goals are compositionality, expressiveness, resource safety, and speed. The example below is explained at

object Converter {
  import fs2.{io, text, Task}
  import java.nio.file.Paths

  def fahrenheitToCelsius(f: Double): Double =
    (f - 32.0) * (5.0/9.0)

  val converter: Task[Unit] =
    io.file.readAll[Task](Paths.get("testdata/fahrenheit.txt"), 4096)
      .through(text.utf8Decode)
      .through(text.lines)
      .filter(s => !s.trim.isEmpty && !s.startsWith("//"))
      .map(line => fahrenheitToCelsius(line.toDouble).toString)
      .intersperse("\n")
      .through(text.utf8Encode)
      .through(io.file.writeAll(Paths.get("testdata/celsius.txt")))
      .run
}

// at the end of the universe...
val u: Unit = Converter.converter.unsafeRun()
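The core transformation of the stream can be exercised without FS2 as an ordinary collection pipeline. A minimal sketch; the sample input lines are made up:

```scala
object ConverterCore extends App {
  def fahrenheitToCelsius(f: Double): Double =
    (f - 32.0) * (5.0 / 9.0)

  // same filter/map steps as the FS2 stream, applied to a plain List
  val lines   = List("// sample data", "", "32.0", "212.0")
  val celsius = lines
    .filter(s => !s.trim.isEmpty && !s.startsWith("//"))
    .map(line => fahrenheitToCelsius(line.toDouble).toString)

  println(celsius.mkString("\n"))  // 0.0 and 100.0, each on its own line
}
```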


Handling Resources

Handling data files that are part of a scala project, e.g. for testing functionality.

/project
|-- /src
|   |-- /main
|       |-- /scala/timeusage/TimeUsage.scala
|       |-- /resources/timeusage/atussum.csv
|-- /target
    |-- /scala-2.11
        |-- /classes
            |-- /timeusage/atussum.csv

The compiler will store /src/main/resources/timeusage/atussum.csv at /target/scala-2.11/classes/timeusage/atussum.csv where it can be found using Paths.get(getClass.getResource("/timeusage/atussum.csv").toURI).toString

Scala API Reference

Continuous Integration





scala-scraper: A Scala library for scraping content from HTML pages


select gui project
$ sbt abandon
> project gui
run with accounts.conf
> run -c /home/xps13/Public/abandon/personal/accounts.conf
assemble jar in /target/scala-2.11/
> project abandon
> assembly



import collection.mutable.Stack
import org.scalatest._

class ExampleSpec extends FlatSpec with Matchers {

  "A Stack" should "pop values in last-in-first-out order" in {
    val stack = new Stack[Int]
    stack.push(1)
    stack.push(2)
    stack.pop() should be (2)
    stack.pop() should be (1)
  }

  it should "throw NoSuchElementException if an empty stack is popped" in {
    val emptyStack = new Stack[Int]
    a [NoSuchElementException] should be thrownBy {
      emptyStack.pop()
    }
  }
}

If you use sbt add the following dependency to your build file

libraryDependencies += "org.scalacheck" %% "scalacheck" % "1.13.0" % "test"

Put your ScalaCheck properties in src/test/scala, then use the test task to check them

$ sbt test
+ String.startsWith: OK, passed 100 tests.
! String.concat: Falsified after 0 passed tests.
> ARG_0: ""
> ARG_1: ""
+ String.substring: OK, passed 100 tests.

Specify some of the methods of java.lang.String like this:

import org.scalacheck.Properties
import org.scalacheck.Prop.forAll

object StringSpecification extends Properties("String") {

  property("startsWith") = forAll { (a: String, b: String) =>
    (a+b).startsWith(a)
  }

  property("concatenate") = forAll { (a: String, b: String) =>
    (a+b).length > a.length && (a+b).length > b.length
  }

  property("substring") = forAll { (a: String, b: String, c: String) =>
    (a+b+c).substring(a.length, a.length+b.length) == b
  }
}




Stack size

run jar with increased JVM stack size
$ scala -J-Xss200m LoopTesterApp

Tail Recursion

Implementation Consideration

If a function calls itself as its last action, the function’s stack frame can be reused. This is called tail recursion. Tail recursive functions are iterative processes.

In general, if the last action of a function consists of calling a function (which may be the same), one stack frame would be sufficient for both functions. Such calls are called tail-calls.
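As an illustration, factorial written with an accumulator is tail-recursive, while the naive version is not (the function names here are our own, not from any library):

```scala
import scala.annotation.tailrec

object Factorial extends App {
  // not tail-recursive: the multiplication happens after the recursive
  // call returns, so each call needs its own stack frame
  def factorialNaive(n: Int): BigInt =
    if (n == 0) 1 else BigInt(n) * factorialNaive(n - 1)

  // tail-recursive: the recursive call is the last action, so one frame is reused
  @tailrec
  def factorial(n: Int, acc: BigInt = 1): BigInt =
    if (n == 0) acc else factorial(n - 1, BigInt(n) * acc)

  println(factorial(5))                      // 120
  println(factorial(10000).toString.length)  // deep recursion, no stack overflow
}
```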

Tail Recursion in Scala

In Scala, only directly recursive calls to the current function are optimized. One can require that a function is tail-recursive using a @tailrec annotation:

@tailrec
def gcd(a: Int, b: Int): Int =
  if (b == 0) a else gcd(b, a % b)

If the annotation is given, and the implementation of gcd were not tail recursive, an error would be issued.




run on .scala file
Ctrl + Shift + P then Show Scala Worksheet (bound to Alt + W)

Web Frameworks



Scalatra is a simple, accessible and free web micro-framework. It combines the power of the JVM with the beauty and brevity of Scala, helping you quickly build high-performance web sites and APIs.



install dependencies
$ android sdk update and select Android Support Repository and Google Repository in the "Extras" category


  • scalability: seamless operability with Java: call Java methods, access Java fields, inherit from Java classes, and implement Java interfaces
  • compatibility
  • brevity
  • high-level abstractions
  • advanced static typing: static type system classifies variables and expressions according to the kinds of values they hold and compute
  • functional: more elegant programs; better support for parallel and concurrent programming
  • functions can be defined anywhere, including inside other functions; functions are like any other value, they can be passed as parameters to functions and returned as results; as for other values, there exists a set operators to compose functions
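The last point can be seen in a few lines (the names here are made up for illustration):

```scala
object FunctionValues extends App {
  // functions are values, assignable to vals
  val inc: Int => Int    = _ + 1
  val double: Int => Int = _ * 2

  // functions can be passed as parameters and returned as results
  def twice(f: Int => Int): Int => Int = f andThen f

  // functions compose with the standard operators andThen and compose
  val incThenDouble = inc andThen double   // double(inc(x))
  val doubleThenInc = inc compose double   // inc(double(x))

  println(incThenDouble(3))  // 8
  println(doubleThenInc(3))  // 7
  println(twice(double)(3))  // 12
}
```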

In Java, you say everything three times; in Scala, you say everything once. Closures (function values) arrived in Java 8; in Scala they have existed from the beginning.

val people: Array[Person]
val (minors, adults) = people partition (_.age < 18)

to make it parallel: val (minors, adults) = people.par partition (_.age < 18) (parallel collection)
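A complete, runnable version of the example (the Person class and sample data are made up):

```scala
case class Person(name: String, age: Int)

object PartitionDemo extends App {
  val people = Array(
    Person("Ann", 12), Person("Bob", 34),
    Person("Cem", 17), Person("Dee", 50))

  // splits the array into (matching, non-matching) in one pass
  val (minors, adults) = people partition (_.age < 18)
  // swap in people.par to evaluate the partition on a parallel collection
  // (on Scala 2.13+ this requires the scala-parallel-collections module)

  println(", "))  // Ann, Cem
  println(", "))  // Bob, Dee
}
```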

Akka framework

Akka Streams


Typesafe Activator

Slick for database access

Play Slick Module


modify conf/application.conf to use a file-based database
db.default.url="jdbc:h2:/path/to/file"
e.g. jdbc:h2:data/test will generate data/test.h2.db in the application root

play slick evolution default apply prompt

browse file-based database using activator (starting Web Console server at
$ activator -> h2-browser

play slick h2 browser connect

edit table simultaneously from h2-browser Web Server console and play application running at localhost:9000
query table SELECT * FROM CAT name
edit table @edit SELECT * FROM CAT name

play slick h2 browser connect

Play Framework

creating an application
$ activator new
starting an application
$ cd play-scalajs-showcase
$ activator run
launch activator desktop
$ activator ui

Play libraries

Deployment General




Activator template for a Play, MongoDB and knockout.js application

Starting with ReactJS, Play 2.4.x, Scala and Anorm


Geographic data processing



GeoTrellis Transit

GeoTrellis Chattanooga Model Demo

Spray-based web application that uses GeoTrellis to do a weighted overlay and zonal summary over land raster data for a project that was completed for the University of Chattanooga at Tennessee

$ cd ~/scala/geotrellis/geotrellis-chatta-demo/geotrellis
$ sbt run


change port in src/main/scala/Main.scala
IO(Http) ! Http.Bind(service, interface = "localhost", port = 9000)
start application
$ cd ~/scala/geotrellis/geotrellis-spray-tutorial
$ sbt
$ run
demo REST queries in src/main/scala/GeoTrellisService.scala
  • http://localhost:9000/ping

  • http://localhost:9000/raster/SBN_inc_percap/draw


  • http://localhost:9000/raster/SBN_inc_percap/stats
{mean: 26.62950322073986, histogram: [[2,34170],[4,43733],[6,17490],[8,47435],[10,53341],[12,122251],[14,82596],[16,96651],[18,183408],[20,95711],[22,98655],[24,89779],[26,78115],[28,101714],[30,102065],[32,41440],[34,72890],[36,42396],[38,33185],[40,11975],[42,26356],[44,36012],[46,18198],[48,20421],[50,28087],[52,27513],[54,6932],[56,9250],[58,17249],[60,6554],[62,17552],[64,19696],[66,1598],[68,3993],[70,8216],[72,2563],[76,13021],[78,2813],[80,550],[82,7726],[86,3025],[88,317],[92,2843],[94,1385],[96,2860],[98,556],[100,2582]] }
  • http://localhost:9000/raster/SBN_farm_mkt/draw


  • http://localhost:9000/raster/SBN_farm_mkt/mask?cutoff=1


  • http://localhost:9000/analyze/draw?cutoff=1


  • http://localhost:9000/analyze/stats?cutoff=1
{mean: 22.955766888898363, histogram: [[2,10576],[4,22930],[6,7207],[8,8310],[10,10010],[12,13177],[14,12758],[16,9399],[18,18027],[20,4039],[22,5779],[24,9623],[26,4040],[28,10650],[30,9280],[32,5161],[34,6067],[36,3497],[38,4232],[40,1171],[42,6380],[44,5903],[46,2785],[48,1917],[50,1504],[52,5465],[54,892],[56,468],[58,1336],[60,1220],[62,803],[64,1342],[66,467],[68,620],[70,956],[72,135],[76,373],[78,57],[80,59],[82,140],[86,22],[88,31],[92,1095],[96,223],[100,915]] }


An extension to the core Scala library for functional programming.

It provides purely functional data structures to complement those from the Scala standard library. It defines a set of foundational type classes (e.g. Functor, Monad) and corresponding instances for a large number of data structures.
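A minimal, dependency-free sketch of what such a type class looks like (the real Scalaz Functor has more operations and associated laws):

```scala
// minimal Functor type class in the spirit of Scalaz
trait Functor[F[_]] {
  def map[A, B](fa: F[A])(f: A => B): F[B]
}

object Functor {
  // instances live in the companion so implicit search finds them
  implicit val optionFunctor: Functor[Option] = new Functor[Option] {
    def map[A, B](fa: Option[A])(f: A => B): Option[B] = fa map f
  }
  implicit val listFunctor: Functor[List] = new Functor[List] {
    def map[A, B](fa: List[A])(f: A => B): List[B] = fa map f
  }
}

object FunctorDemo extends App {
  // one generic function works for every F that has a Functor instance
  def double[F[_]](fa: F[Int])(implicit F: Functor[F]): F[Int] =, _ * 2)

  println(double(Option(21)))    // Some(42)
  println(double(List(1, 2, 3))) // List(2, 4, 6)
}
```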


Let the types speak for themselves via the Scalaz Scaladocs!

The examples module contains some snippets of Scalaz usage.

The wiki contains release and migration information.

The typelevel blog has some great posts such as Towards Scalaz by Adelbert Chang.

Learning Scalaz is a great series of blog posts by Eugene Yokota. Thanks, Eugene!


to run code with Node.js (instead of Rhino)
$ sbt
> set scalaJSStage in Global := FastOptStage
> last
get stack traces resolved on Node.js
$ npm install source-map-support
create single JavaScript file from sbt
> fastOptJS or recompile as needed > ~fastOptJS
after changes to build.sbt reload the sbt configuration
  • hit enter to abort the ~fastOptJS command
  • type reload
  • start ~fastOptJS again
enable testing with uTest (depends on phantomJS)
install phantomJS $ npm install -g phantomjs
$ sbt
> run
|- project/
|-- build.sbt
|-- src
|--- main
|---- scala
|----- program.scala
|--- test
|---- scala
|----- test.scala



Scala.js Single Page Application (SPA)

Scala.js Tutorial


WOOT model for Scala and JavaScript via Scala.js

$ cd ~/scala/wootjs
$ sbt "project server" run or $ sbt server/run


Use Scala to define your tasks. Then run them in parallel from the shell.

Customizing paths

unmanaged source directories
scalaSource and javaSource

Organizing Build

  • project/Dependencies.scala to track dependencies in one place

Cross-build projects

Multi-build projects

list projects from sbt
> projects
select specific project
> project [project name]

Project structure

|- project/
|-- build.sbt
|-- program.scala
make sure scala version is the same as installed (version printed when running $ scala from Linux shell)


install conscript
$ curl | sh
install giter8 templating system
$ cs n8han/giter8
fetch template
execute in new project folder $ g8 typesafehub/scala-sbt


create project/EnsimeProjectSettings.scala

import sbt._
import org.ensime.Imports.EnsimeKeys

object EnsimeProjectSettings extends AutoPlugin {
  override def requires = org.ensime.EnsimePlugin
  override def trigger = allRequirements
  override def projectSettings = Seq(
    // your settings here
  )
}
  • run sbt $ sbt or from Emacs M-x sbt-start
  • compile project > compile
  • generate .ensime for the project
  • generate project/.ensime for the project definition
  • add .ensime, project/.ensime and project/EnsimeProjectSettings.scala to .gitignore

SBT Commands

The commands become available after adding ensime to the global SBT plugins (Windows: C:\Users\[username]\.sbt\0.13\plugins.sbt, Linux: ~/.sbt/0.13/plugins.sbt)

Generate a .ensime for the project (takes space-separated parameters to restrict to subprojects)
ensimeConfig (previously gen-ensime)
Generate a project/.ensime for the project definition.
ensimeConfigProject (previously gen-ensime-project)
Add debugging flags to all forked JVM processes.
> debugging
Remove debugging flags from all forked JVM processes.
> debugging-off




We show a simple Scala code example for ML dataset import/export and simple operations. More complete dataset examples in Scala and Python can be found under the examples/ folder of the Spark repository. We refer users to Spark SQL’s user guide to learn more about SchemaRDD and the operations it supports.



2016 Video Replays


Creative Scala

$ cd ~/Dropbox/GitHub/creative-scala
$ npm install
$ grunt watch and navigate to http://localhost:4000



Functional Programming Principles in Scala

by Martin Odersky, École Polytechnique Fédérale de Lausanne

Principles of Reactive Programming

by Martin Odersky, Erik Meijer, Roland Kuhn, École Polytechnique Fédérale de Lausanne

Installation Windows

  • set JAVA_HOME
  • create .bat script
set SCRIPT_DIR=%~dp0
rem java -Xms512M -Xmx1536M -Xss1M -XX:+CMSClassUnloadingEnabled -XX:MaxPermSize=256M -jar "%SCRIPT_DIR%sbt-launch.jar" %*
%JAVA_HOME%\bin\java -Xms512M -Xmx1536M -Xss1M -XX:+CMSClassUnloadingEnabled -XX:MaxPermSize=256M -jar "%SCRIPT_DIR%sbt-launch.jar" %*

Installation Fedora 22

install sbt
curl | sudo tee /etc/yum.repos.d/bintray-sbt-rpm.repo
sudo dnf install sbt

this will create a link in /usr/bin/sbt

in order to call sbt from the command line, add the following to ~/.profile:

export SCALA_HOME="/usr/share/scala"
export SBT_HOME="/usr/share/sbt-launcher-packaging"
export PATH="$PATH:$SBT_HOME/bin:$SCALA_HOME/bin"


go to test project folder
cd /home/xps13/Dropbox/Programming/Scala/hello2

create hw.scala if it doesn’t exist already

object Hi {
  def main(args: Array[String]) {
    println("Hi!")
  }
}
generate directory for classes
mkdir classes
generate class files in output directory (classpath)
scalac -d classes hw.scala
execute bytecode with classpath
scala -cp classes Hi returns Hi!

generate shell script

#!/bin/sh
exec scala "$0" "$@"
!#
object Hi extends App {
  println("Hi!")
}
Hi.main(args)
make executable and run
chmod +x


  • if sbt cannot find all dependencies, check files in project subfolders for sbt.version=0.13.x and update with installed sbt version
  • installed sbt version can be checked by running sbt sbtVersion from within a non-project directory


navigate to directory
$ cd ~/Dropbox/Programming/Scala/prog-in-scala/ch4/
compile source code to class files
$ scalac ChecksumAccumulator.scala Summer.scala
using Scala compiler daemon called fsc
$ fsc ChecksumAccumulator.scala Summer.scala
stop the fsc daemon
$ fsc -shutdown


  • Scala Days
  • BeeScala
  • w-jax Die Konferenz für Java, Architektur- und Software-Innovation
  • topconf
  • Scala Italy


16 August 2015