
Monday, December 10, 2012

Cool Monday: Exploration of dynamic DB access from Scala

I use Scala on Android and I don't like the integrated database API. It's very verbose and very stateful. I had written my own ORM (DAO would be a more appropriate tag) a while back, before I used Scala, but it's not enough anymore. So now I'm on a quest for a better database API. My dream is something small that handles the schema for me and is type-safe. A nice DSL that is converted to SQL at compile time and does code generation, so it's as fast as hand-writing everything but reduces the code footprint by an order of magnitude (at least). Scala SLICK looks promising. It fits most requirements. But it's kinda big for Android projects (you need the Scala library too!) and has not yet hit a stable version, so I wouldn't be comfortable shipping it. I will definitely give it a thorough test when Scala 2.10 is stable and SLICK is released. Oh, and it needs a third-party JDBC driver for Android. This is another level of abstraction and therefore another source of slowness. I contemplated writing my own clone targeted at Android but never came around to actually doing it (yet!). It seems like a herculean task for a single developer working in spare time.

Meanwhile

Yesterday I started thinking about how dynamic languages handle databases. And I got an idea. Scala has the type Dynamic that does compilation magic to provide syntactic sugar for working with dynamic languages or objects. Here's an idea: do queries in plain SQL and perform extraction of data in a dynamic way.
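For context, this is roughly how scala.Dynamic desugars member selection - a minimal standalone sketch (the Row class and its backing map are made up for illustration, not part of the project):
import scala.language.dynamics

// Any member access the compiler can't resolve statically on a Dynamic type
// is rewritten into a selectDynamic call with the member name as a string.
class Row(values: Map[String, Any]) extends Dynamic {
  def selectDynamic(name: String): Any = values(name)
}

val row = new Row(Map("name" -> "Ada", "age" -> 36L))
row.name  // compiles to row.selectDynamic("name") and returns "Ada"
row.age   // compiles to row.selectDynamic("age") and returns 36L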
And how to do this with a database Cursor? Just wrap it up to provide the necessary methods.
class WrappedCursor(cursor: Cursor) extends Cursor {
  // delegated methods go here
}
Why do I need this? The cake pattern of course: the dynamic cursor gets mixed in.
import scala.language.dynamics

trait DynamicCursor extends Dynamic { this: Cursor =>

  def selectDynamic(name: String) =
    getColumn(getColumnIndex(name))

  def getColumn(index: Int) =
    getType(index) match {
      case Cursor.FIELD_TYPE_BLOB    => getBlob(index)
      case Cursor.FIELD_TYPE_FLOAT   => getDouble(index)
      case Cursor.FIELD_TYPE_INTEGER => getLong(index)
      case Cursor.FIELD_TYPE_NULL    => null
      case Cursor.FIELD_TYPE_STRING  => getString(index)
    }

  def toSeq = (0 until getColumnCount) map getColumn
}
I targeted API level 14 (Ice Cream Sandwich) since getType (a method on Cursor) is available from API 11 on. The key method here is getColumn, which abstracts over types. So you can read a column and do pattern matching on it. Or, if you're evil, use implicit conversions from Any to String, Long etc. Or use an implicit conversion to a "converter":
implicit class Converter(val value: Any) extends AnyVal{
  def blob = value.asInstanceOf[Array[Byte]]
  def double = value.asInstanceOf[Double]
  def long = value.asInstanceOf[Long]
  def string = value.asInstanceOf[String]
}
But the real deal is selectDynamic. This allows you to write code like this
val c = new WrappedCursor(result) with DynamicCursor
c.someColumn.long
This compiles down to selectDynamic("someColumn"), which calls getColumn, and finally the implicit conversion is inserted, allowing a terse cast to Long.
And I threw in a conversion from a row to a Seq that takes a snapshot of the current row. This allows pattern matching on rows. And you can now construct a Stream that handles the Cursor state and lazily evaluates and stores these snapshots. Therefore you can abstract away all mutability and treat the cursor as an immutable collection.

Said conversion to a stream:
def CursorStream(cursor: DynamicCursor with Cursor) = {
  def loop(): Stream[Seq[Any]] = {
    if(cursor.isAfterLast)
      Stream.empty[Seq[Any]]
    else {
      val snapshot = cursor.toSeq
      cursor.moveToNext()
      snapshot #:: loop()
    }
  }
  cursor.moveToFirst()
  loop()
}
And some more implicits to help
implicit class RichCursorRaw(cursor: Cursor) extends AnyVal{
  def dynamicRaw = new WrappedCursor(cursor) with DynamicCursor
  def toStream = CursorStream(dynamicRaw)
}
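To see how the pieces fit together, here's a hedged usage sketch - db, the query and the column names are assumptions for illustration; WrappedCursor, DynamicCursor and the implicits (Converter and RichCursorRaw) are the ones defined above and need to be in scope:
// Hypothetical query; `db` is assumed to be an SQLiteDatabase obtained elsewhere.
val result: Cursor = db.rawQuery("SELECT name, age FROM people", null)

// Dynamic, by-name column access on the current row:
result.moveToFirst()
val row = new WrappedCursor(result) with DynamicCursor
val age = row.age.long  // selectDynamic("age"), then the Converter cast

// Or treat the whole result set as an immutable collection of row snapshots:
result.toStream foreach {
  case Seq(name: String, age: Long) => println(name + " is " + age)
  case other                        => println("unexpected row: " + other)
}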
All the source is in the project on github https://github.com/edofic/dynamic-db-android (work in progress).




Tuesday, December 4, 2012

Homework - functional style (external sorting)

I'm attending an Algorithms and Data Structures class this semester. The material itself is quite interesting and one TA is pretty cool too. But I don't like the professor (which makes the whole experience much worse) and I believe the homework could be much better. Oh, and we didn't even mention the functional approach... you know, Haskell, Scala and the like. All we do is imperative, C-style code in Java. Enough ranting. This is how I saw the bright side.

Le problem

We were doing external sorting. More specifically: balanced natural external merge sort. I hope I translated this right (probably not). In its essence the algorithm looks like this:

  • you have multiple tracks you read from and write to
  • you have the current element of each track in memory
  • you write out squads (non-descending sub-sequences): take the minimum element that is greater than or equal to the last one written, or, if no such element exists, take the minimal element, which starts a new squad (a short sketch of this rule follows the list)
  • every time a squad ends, your write pointer hops to the next track
  • repeat until all elements are on a single track (hopefully sorted in non-descending order)
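A minimal sketch of that selection rule for ascending order - standalone, with names I made up; the real MultiReader later in the post threads this through a Stream:
// Given the last element written and the current head of each input track,
// prefer the smallest head that continues the non-descending run (>= last);
// otherwise the run ends and we take the smallest head overall.
def pickNext(last: Int, heads: Seq[Int]): Int = {
  val continuing = heads.filter(_ >= last)
  if (continuing.nonEmpty) continuing.min else heads.min
}

pickNext(7, Seq(3, 9, 5)) // 9: smallest head that keeps the run going
pickNext(7, Seq(3, 2, 5)) // 2: no head >= 7, so a new squad starts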
Quite simple, right? The TAs even provided us classes (talking Java here), InTrack and OutTrack, to manage tracks and I/O. Well, my problem is that I grew to dislike the imperative style. Surely it may matter for performance, but since this is homework, performance didn't matter - so I wrote pretty code. I wanted my central code (the heart of the algorithm) to be a few lines at most.
This is my final product (bear in mind there was an additional twist: the code should also be capable of sorting in non-ascending order, hence the up variable).
Some additional explanation: all tracks should be in separate files (no overwriting - for automatic checking). N is the number of tracks, prefix is the track name prefix, and i is the current iteration.
int i = 0;
Iterable<Integer> source = new InTrack(inName);
MultiSink sink;
do{
    sink = new MultiSink(prefix, i, N, up);
    for(int n : source) sink.write(n);
    source = new MultiSource(prefix, i, N, up);
    i++;
} while (sink.isMoreThanOneUsed());
I'm probably not allowed to share my full solution because of university rules, so I won't.
Now to comment on this. I have a source that's agnostic to the number of open files, and a similar sink. End-point switching is implemented in MultiSink and element choosing in MultiSource. Both InTrack and MultiSource implement Iterable (and Iterator) so I can use them in a for-each loop. And the code is as pretty as I can get it (while remaining in Java). All in all ~300 lines (with InTrack & co). After removing unneeded utility methods and comments, ~220 lines. Eww... that's way too much.

Scala to the rescue

Let's rewrite this in a functional manner using Scala. And while I'm at it, no vars or mutable collections.
Can the input be a collection? Just implement Traversable? Not really. The whole point of tracks is that they only hold one element in memory (or a few for efficiency, but that's currently not my concern). So a track can be implemented as a Stream (a linked list with a lazy val for the tail).
import java.io.File
import java.util.Scanner

def Reader(filename: String) = {
  val sc = new Scanner(new File(filename))
  def loop(): Stream[Int] = {
    if (sc.hasNextInt)
      sc.nextInt() #:: loop()
    else
      Stream.empty[Int]
  }
  loop()
}
This is the constructor function for the input stream. It just returns a recursive value that has a Scanner in its closure. Since stream elements are immutable you get an ironclad guarantee that sc will stay in sync. And you get all the collections stuff for free. Moving on: how to abstract over multiple streams? That should be a stream again, right? I kinda feel my code is too complicated and that it could be done simpler, but this is what I came up with:
def MultiReader(prefix: String, phase: Int, N: Int, up: Boolean) = {
  def loop(last: Int, sources: Seq[Stream[Int]]): Stream[Int] = {
    val nonEmpty = sources.filterNot(_.isEmpty)
    if(nonEmpty.length==0)
      Stream.empty[Int]
    else {
      val (low,high) = nonEmpty.
        map(_.head).zipWithIndex.
          partition(t => up && t._1 < last || !up && t._1 > last)
      val (e,i) = (if(high.length>0) high else low).minBy(_._1)
      e #:: loop(e, nonEmpty.updated(i, nonEmpty(i).tail))
    }
  }
  loop(0, (0 until N).map(n => Reader(prefix + "-" + phase + "-" + n)))
}
Let's walk through it. Again, the stream is recursive. It starts with a collection of Readers pointed at the right files. Then in each step you filter out the empty streams (tracks with no more elements) and partition the heads according to the last element (the argument). If there are higher ones you take their minimum, else you take the minimum of the lower ones. And you loop, passing on the read element and a new collection - the non-empty streams, with the one you just read from advanced by one element.

The Writer was a bit trickier. It needs internal state, but I prohibited mutable state. The solution is to return a new Writer containing the new state every time you write. Then the user must just be careful not to use stale Writers - not that big a deal.
This is the Writer trait
trait Writer{
  def write(num: Int): Writer
  def moreThanOneUsed: Boolean
}
Very simple interface. And here's the recursive constructor function
import java.io.{BufferedWriter, FileWriter, PrintWriter}

def Writer(prefix: String, phase: Int, N: Int, up: Boolean): Writer = {
  val tracks = (0 until N).map(n =>
    new PrintWriter(
      new BufferedWriter(
        new FileWriter(prefix + "-" + phase + "-" + n))))
  def mkWriter(i: Int, last: Int, used: Boolean): Writer = new Writer{
    def write(num: Int) = {
      val (ni,nu) =
        if (up && num < last || !up && num > last)
          ((i + 1) % tracks.length, true)
        else (i, used)
      tracks(ni).print(num)
      tracks(ni).print(' ')
      tracks(ni).flush()
      mkWriter(ni, num, nu)
    }
    def moreThanOneUsed = used
  }
  mkWriter(0, if (up) Integer.MIN_VALUE else Integer.MAX_VALUE, used=false)
}
It creates all the tracks to be put in the closure. The first writer has the proper start value, then every next one is constructed like this: figure out the new values for the track number and 'used' (the long if), then actually write out and return a new writer encapsulating the new track number and 'used'. Since these writers are quite lightweight, garbage collection pressure shouldn't be a problem, especially since the whole process is bound by I/O. Anyway, you could optimize by creating all possible states in advance and just passing a new reference each time.

Putting it all together.
def loop(i: Int, source: Stream[Int]){
  val sink = source.foldLeft(Writer(prefix, i, N, up))(_ write _) 
  if (sink.moreThanOneUsed) loop(i+1, MultiReader(prefix, i, N ,up))
}
loop(0, Reader(inName))
So you take an input stream and fold it over a writer, writing in each step. If more than one track was used, you repeat with a new input stream.
I find this solution MUCH MORE elegant. Not to mention it's just 65 lines of Scala. But it makes me really sad that they don't even mention functional programming in the algorithms course. I'm probably gonna pay the professor and the TAs a visit in the near future.



Friday, November 23, 2012

Nomadic programming

DISCLAIMER: This is about my opinion. And may or may not contain some boasting. And is also a bit of a brain dump.

Yep, this is not a typo. Not monadic but nomadic. Although monads are cool too.
A few days ago the CTO of a company I work for said it's time to specialize. He was talking about my career. Offering me a few things to try and then pick one. But this got me thinking.
Today another CTO told me that I'm "essentially a good dev" and that chosen software stack doesn't matter. And I agree. (Patrick does too).
I'm an engineer, not a programmer (or, god forbid, a coder). I solve problems, not just write code. It just happens that code is the solution to most problems I try to solve. The world is increasingly computer oriented. Now everything (literally!) has a processor inside. But processors are dumb, so you need people to program them. These people are the equivalent of the factory workers before robots started taking over. So now people need to migrate one level up the abstraction ladder. Labor is now mental. Not everything is the same (like in a traditional factory) but it is the same concept. And you are pumping out slightly different versions. Welcome to a modern software shop.
But all this "computerization" also pushes up the upper bound of complexity. And this is where engineers kick in. Complex problems require (hopefully less) complex solutions. And you need people who can manage this complexity. Not by being machines, but by understanding, inventing concepts and structuring them. Big software has millions if not billions of moving parts. Imagine a machine of such complexity. There probably isn't a human out there who could manage this in its entirety. But software engineers are expected to do it. You need the ability to see something and relate it to known concepts, or invent a new concept and relate it to other stuff. Connect things together. And to do this well you need broad knowledge. Yes, you need specialization - depth - to do something very well, but this is a lesser problem. See, a good engineer can go from zero to awesome in a new field very quickly. Learning a new language can be done in a week, and a software stack in a bit more. I'm not saying you become an expert or the very best in the field, but if you have the urge to learn you are not far behind. So a good engineer is one who has the ability to learn and to produce real solutions with that knowledge.


Nomads

So where does nomadic programming fit into this philosophy? Going from language to language, from framework to framework, and changing software stacks. Not settling down. This (in my opinion) teaches you what can be done and how stuff can be done. In all possible ways. So you can figure out the best solution. Learning all the time also keeps you on the bleeding edge, and this is fun and engaging. Having new toys all the time! I can't imagine myself working with just one thing for the rest of my life. Meeting new languages and new frameworks is what I do in my free time. So when a customer says they need Ruby on Rails and I've never used it before... it's not a problem. I've seen Ruby, I know Python and I know how MVC is supposed to work. Putting it all together is not that hard. You can even hit the ground running and start working right away. Of course you'll be a bit slower at the start, but you can catch up. So no, I don't want to specialize. I want to know everything there is to know! And I believe this will make me a better engineer. Or developer. Or person. And I believe that the quality of their employees should matter to the employer. It is developers who create their products.

You should always strive to become better!

Monday, November 12, 2012

Cool Monday: Scala Macros

For me the highlight of this week was discovering Bootstrap. I heard of it before but never looked into it. Probably because I wasn't doing web stuff. The thing is bloody awesome. Back on topic.

Scala 2.10 RC2 was released this Friday. Considering 2.9 had 3 RC releases, the 2.10 final is probably quite near. And it brings some awesome features. One of them is macros.

Macros

So what are macros basically? Code executed at compile time. And that's about it. 
So what is so great about that? Well, you can do AST modifications and stuff that gets quite into compiler-plugin territory, in your regular code. That means you can do pretty advanced things and build abstractions without sacrificing performance. Oh yeah, you can also do type checking and emit compile-time errors. Safety first, kids!
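As a tiny taste of the compile-time checking angle, here's a sketch of a def macro (2.10-style API) that rejects anything but an integer literal while compiling the call site - the names are mine, not from any library:
import scala.language.experimental.macros
import scala.reflect.macros.Context

object CompileTimeChecks {
  // Looks like a normal method, but the argument is inspected at compile time.
  def requireLiteral(x: Int): Int = macro requireLiteralImpl

  def requireLiteralImpl(c: Context)(x: c.Expr[Int]): c.Expr[Int] = {
    import c.universe._
    x.tree match {
      case Literal(Constant(_: Int)) => x // a literal: pass the expression through unchanged
      case _ => c.abort(c.enclosingPosition, "expected an integer literal")
    }
  }
}

// CompileTimeChecks.requireLiteral(42)                 // fine
// val n = 42; CompileTimeChecks.requireLiteral(n)      // compile error: expected an integer literal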

Usages

SLICK uses macros (in the experimental API) to transform Scala expressions into SQL. At compile time! ScalaMock uses them to provide a more natural API for testing. As said, you can use them for code generation or validation at compile time. Good library design will be able to minimize boilerplate code even further now. And some people will argue that macros make Scala an even harder language.

Type Macros

This is the best part for me. But unfortunately it's not implemented yet. Or at least not documented. There are some methods with suspicious names in the API but no useful documentation. In all presentations this is referred to as "future work", but I still have my fingers crossed that it makes it into the final release.
So what's the fuss? Automatically generated types. Large scale code-gen. 
As in the ability to programmatically create types at compile time. As a consequence you can create whole classes with a bunch of methods. And I already have a use case of my own. I want to make a type-safe ORM for Android that's super fast. I did the YodaLib ORM a while back. It uses reflection (although it's usually fast enough) and provides a Cursor that lazily wraps rows into classes. And you need to make sure by hand that your class corresponds to the columns of your result set. Not very safe. I had an idea to make a statically-typed safe interface for the database when I first heard about HList. You would do a projection as an HList and result rows would be lazily wrapped into HLists. But using them for wrapping every row (possibly filling data in with reflection) would be a performance penalty. Not to mention a mess to implement. Now consider using a macro to generate the wrapping code. It would be no slower than accessing columns by hand. And a type macro would automatically create a case class for a given projection. Heavens. I'm just waiting for official documentation on macros... this is a tempting project.
(Oh yeah, it would manage your schema too, so you don't need to worry about consistency between your code and your schema.)

Documentation

Here's scalamacros.org, which gives some information, and also some quite useful slides. I hope that now that 2.10 is in RC things stabilize, because in the milestone releases the API was changing constantly. There's a nightly API scaladoc... Proper documentation is apparently coming soon.

Le Code

A use case for macros, loop unrolling.
Below is a trivial sample of repetitive code.
class Manual{
    def show(n: Int){
        println("number "+n)
    }
    
    def code(){
        show(1)
        show(2)
        show(3)
        show(4)
        show(5)
    }
}
We can deal with the repetition by writing a loop (a higher order function, really).
for( i <- 1 to 5 ) show(i)
But this doesn't generate the same AST (and bytecode)!
Protip: use
scalac -Xprint:parser -Yshow-trees Manual.scala
to see AST after parsing.
Sometimes (rarely!) you want to unroll the loop to produce the same bytecode as typing out all the iterations by hand.
Macros.unroll(1,5,1)(show)
With a proper unroll macro defined, that is. I spent an hour coming up with this implementation... and then scalac started crashing on me... Is there something terrible in my code?
I gave up and went on to do useful stuff...But macros hear me! I'll be back.
import scala.reflect.macros.Context
import scala.language.experimental.macros

object Macros {
  def unroll(start: Int, end: Int, step: Int)(body: Int => Unit) = macro unrollImpl

  def unrollImpl(c: Context)(start: c.Expr[Int] ,end: c.Expr[Int], step: c.Expr[Int])(body: c.Expr[Int => Unit]): c.Expr[Any] = {
    import c.universe._
    val Literal(Constant(start_value: Int)) = start.tree
    val Literal(Constant(end_value: Int)) = end.tree
    val Literal(Constant(step_value: Int)) = step.tree

    val invocations = Range(start_value, end_value, step_value) map { n =>
      val n_exp = c.Expr(Literal(Constant(n)))
      reify{
        ((body.splice)(n_exp.splice))
      }.tree
    }
    c.Expr(Block(invocations:_*))
  }
}

Friday, November 2, 2012

Hunt for a web framework that works

I have this personal project I want to do that includes a web application, and I want to learn something. So I'm on the hunt for a language, environment and framework.

Other stuff

I did some PHP a few years back and definitely don't want to go there anymore. I also did some .NET, and it's even part of the curriculum here at FRI. But clicking through wizards in Visual Studio feels weird to me - not like development should be done. And I use GNU/Linux as my primary (and only) OS, so that's out of the question. I did read about Java Server Pages and Faces and even tried a few things out. But luckily I didn't end up doing the project I was preparing for, so I didn't need them. It looked ugly anyway. I did some flirting with GWT - does that even count as a web framework?

Node.js

I heard about Node.js quite some time ago but put off looking into it because my JS was really rusty. Recently I brushed up on my JavaScript skills (to write a "compiler" targeting JS) and gave it a shot. Node is good. It's fast, it's agile, it makes you think in a different way. I was feeling empowered. I did some simple stuff and I liked it.

Static vs dynamic

Later I kinda got a job as a Ruby on Rails dev. And I hated it. So I didn't take it. It would have taken up too much time anyway - I'm a student. Ruby is okay. Rails is okay. The problem was the size of the problem. The application we were building (a team of devs) was quite complex and I came into an existing (moderately sized) Ruby code base. Learning Ruby and Rails as I went was fun, but navigating the project was a pain in the ass. Of course documentation was non-existent and the IDE couldn't help me because it didn't know either. So: a lot of regex searching and walking around asking stuff. Also refactoring... inevitable but hard.
This cemented my opinion on static vs dynamic typing. (Static for everything but a short script; more on that another time.)

Scala

Then I learned about the good parts of static typing through Scala and Haskell. Doing web in Haskell seems a bit intimidating (I will give it a go eventually, I promise) so I'm rolling with Scala. It looks like there are two big names here: Play! and Lift. I watched a few talks and read a few blogs about both to get the central points.
The big difference seems to be their view on state: Lift goes for stateful, Play for stateless. Play kinda seems like it has a bigger community, their documentation is stellar, and they're now part of the Typesafe stack. No brainer then. Play it is.

Play! framework

I dived into the documentation, reading samples and explanations about the infrastructure and APIs. The samples really clicked with me - it felt like porn. No analogy: reading elegant Scala sources for a web app for the first time felt like I was doing something naughty, like things shouldn't be that good.
Live reloading is great too. A friend of mine is a J2EE dev and he's constantly nagging about build and deploy times; I get them near instantaneous. And compile-time checking of routes and templates? Oh my god, yes. Bear in mind, compile time is all the time: when I hit ctrl-s for "save all open files" I quickly see if the compiler has any complaints, even before I refresh the browser.
I just did some experimenting with features then... for a few hours. Everything feels so simple yet powerful. Why did nobody tell me about this before?
Okay, it has to have some weaknesses, but I didn't find them. Yet. And that's what counts.

Heroku

Now this is just the cherry on top of my cake. It took me two minutes to deploy my hello world app, and that includes the time needed to install Heroku's toolkit. You just create an app and push to a remote git repo. Heroku detects that it's a Play/Scala app and installs dependencies; the rest is done by SBT. And it just works. Hassle-free deployment for developers. Yay.

Now I have my stack and even a host. So I just need to write an awesome service and generate traffic. How hard can it be?

Monday, October 29, 2012

Cool Monday: HList and Shapeless

HList as in heterogeneous lists. This means every element is of a different type. Yeah sure, just use List&lt;Object&gt; in Java, but that is in no way typesafe. I want the compiler to know the type of every element and stop me if I try to do something silly.

Linked lists to the rescue

So what's a linked list anyway? A sequence of nodes with pointers to the next one. And a nice implementation (still talking Java here) would be generic to allow type-safety for homogeneous lists. It turns out generics are the solution for HLists too - just introduce an additional type parameter. Apocalisp has a great post on implementing them in Java. Here's just a factory method to see the gist:
public static <E, L extends HList<L>> HCons<E, L> HCons(final E e, final L l) {
   return new HCons<E,L>(e, l);
}
Problem comes with instantiation.
final HCons&lt;Double, HCons&lt;String, HCons&lt;Integer[], HNil&gt;&gt;&gt; b =
      cons(4.0, cons("Bar", cons(new Integer[]{1, 2}, nil())));
Java requires A LOT of type annotation. It works, but it's just painful and doesn't pay off.

Type inference to the rescue

Type inference gets rid of this problem entirely. Let's implement a whole working HList in Scala.
abstract class HList[H,T<:HList[_,_]] {
  def head: H
  def tail: T
  def ::[A](a: A) = Hcons(a, this)
}

object HNil extends HList[Nothing, Nothing]{
  def head = throw new IllegalAccessException("head of empty hlist")
  def tail = throw new IllegalAccessException("tail of empty hlist")
}

case class Hcons[H,T<:HList[_,_]](private val hd: H, private val tl: T) extends HList[H, T]{
  def head = hd
  def tail = tl
}
So this list can be instantiated like this
scala> val myHList = 1 :: "hi" :: 2.0 :: HNil
myHList: Hcons[Int,HList[java.lang.String,HList[Double,HList[Nothing,Nothing]]]] = Hcons(1,Hcons(hi,Hcons(2.0,HNil$@dbb62c)))
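A couple of accesses, to show the compiler tracking each element's type - a small sketch against the implementation above:
val myHList = 1 :: "hi" :: 2.0 :: HNil

val i: Int    = myHList.head       // statically known to be Int
val s: String = myHList.tail.head  // statically known to be String
// val oops: String = myHList.head // does not compile: found Int, required String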

And it just works. The Scala compiler does all the heavy lifting with the type annotations. This implementation is bare bones and doesn't provide any useful methods (not even random access!). Check out Miles Sabin's shapeless project for a useful implementation and much more. It provides indexing, map, fold, concatenation, type-safe casts, conversions to tuples (and abstracting over arities!) and back. And even conversions to and from case classes. Just click the link above and read the readme. It's awesome.

Monday, October 22, 2012

Setting up for scala development on Android

I've been developing for Android for more than a year now and in Scala for a few months. So naturally I wanted to combine the two. But it's not dead simple. This is kinda a tutorial and a reference if I ever forget how to do this. It took me a few days to figure it all out. I tried Maven, Ant with a special config, and sbt (I need to learn more about this one), but in the end I just wanted a fast solution integrated into my IDE.

So I use IntelliJ IDEA Community Edition for my IDE. You should check it out, it's totally awesome. It's primarily a Java IDE but the Scala plugin rocks. It offers some more advanced text editing capabilities - not like vim or emacs, but enough for me. It also brings coloring and editing features that are language-aware. So you have a shortcut (Ctrl-W) to select a semantically valid block, and pressing it again expands to the next bigger valid piece of code. And stuff like that. The real-time structure view is nice and there are some cool refactorings. But the Scala REPL is where the fun begins: you get your module classpath pre-set and you get full editor capabilities in the REPL. Enough with the advertisement (they didn't pay me to do this), let's get to work.

Prerequisites

  • JDK... duh! I use OpenJDK 7; IDEA gives some warnings but it works like a charm
  • Android SDK and at least one platform
  • IntelliJ IDEA
  • Scala distribution. I recommend you use the latest stable release from here

Setting up

First install the Scala plugin. It's quite straightforward: Plugin Manager -> Browse repos -> search for scala -> select -> OK.
Now the actual setting up. I use global libraries for all my projects; you can also put these under plain Libraries and do it on a per-project basis.
Open Project Structure (with no project open) and go to Global Libraries. You need to create two libraries containing jars from <path to scala>/lib/.
First scala-compiler with scala-compiler.jar and scala-library.jar, then scala-library with scala-library.jar and anything else you might need. The reason for including the Scala library in the compiler library is that the compiler also relies on the Scala lib. It took me quite some time to figure this out.
This whole process could be automated if you added Scala to your project when creating it, but that's not possible with Android, so you need to know how to do it by hand.

Creating a project

  • Project from scratch
  • add an Android module and configure it
  • now go to Project Structure, add a Scala facet to this module, go to its settings and set the compiler jar
  • back in the module, add a dependency on the global scala-library
  • set the dependency to provided. This is important - otherwise it will try to dex the whole library and you'll end up with the "too many methods" error.
Now your project should compile. But not run.

Running

Obviously, not including the Scala library in the build means you need to provide it another way. For developing on the emulator I customized it to provide a predexed Scala library, following an excellent tutorial.
In a nutshell
$ git clone git://github.com/jberkel/android-sdk-scala.git
$ cd android-sdk-scala
$ ./bin/createdexlibs
$ bin/createramdisks
$ emulator -avd ... -ramdisk /path/to/custom.img
$ adb shell mkdir -p /data/framework
$ for i in configs/framework/*.jar; do adb push $i /data/framework/; done
And reboot.

There is also a (not so trivial) part about patching a device. However, you can try the Scala Installer from Play to do this. I had some success and some failures.

Now the app should run on your device.

Deploying

Well, it doesn't work on other devices right now. For export you need to change the scala-library dependency back to compile to include it in the build. The trick now is to enable ProGuard to remove unnecessary methods and classes so the jar fits through the dexer. You do this in Tools -> Android -> Export: select ProGuard and your config. I got mine from jberkel's repo. That's it. Sadly this export takes quite some time - Scala's standard library is not a piece of cake after all (actually it is a cake). A minute and a half on my machine for small apps. So I only do this for testing on other phones and for deployment.

Faster compilation

Compiling with scala-library set to provided is much faster, but not fast enough for me. I want to be doing stuff, not waiting for it to compile.
It turns out the compiler is the big time sucker (and I'm being Capt. Obvious here). After all, scalac is not known for its speed.
Enter FSC, the Fast Scala Compiler. This is a Scala compiler running in the background with everything preloaded, doing only incremental compilation. It even comes with the standard Scala distribution and is supported by IntelliJ IDEA. Great.
To set it up just head over to Project Structure -> Scala facet and select Use FSC. Then immediately click Settings to access the project settings and set the compiler jar for the compiler.
Success. Scala builds are now on par with (or even faster than!) Java ones.
No more fencing for me.

Thursday, October 18, 2012

Virtual machine in C(++)

This is not a tutorial. This post is a flashback I had today. It might be a bit of fiction, as my memory of events tends to be fuzzy at times. But I promise it at least resembles the real story.
I was in elementary school, had just found out about programming, and was learning C++. After reading "C++ na kolenih" by Goran Bervar I was empowered by knowledge and tried to do all sorts of projects. Mostly in the console. Stuff like a subtitle format converter - NIH syndrome. I was a bit frustrated because I couldn't find any books about Windows programming in the library. Yes, the library was my primary source of information, because my English was not nearly good enough for technical stuff.
I might add here that I worked on Windows 98 (and later XP) with DevC++. I found out about Visual Studio a few years later and did some Windows development.
I digressed a bit. Then came the most optimistic idea: a virtual machine. Something quite high level (an instruction to print) and eventually an assembler. I now realize I was always into language stuff. So I designed a machine language with just enough instructions to do Hello World, that is, PRINT and END.

Implementation

At first I thought about doing a monolithic structure - a switch statement (in fact what I've done with scrat recently). But I had some concerns. What if the number of instructions rises a lot? I'd be left maintaining spaghetti code. Or at least I thought that's what spaghetti code looks like, but in retrospect I believe I had good taste anyway.
But I tried that anyway. Just for kicks. I did the whole machine as one class that had an array for memory and a single point of entry - boot. It ran a while loop with PC <- PC+1, fetched an instruction from memory, switched on it, called the appropriate method to implement that instruction and looped. It even had registers. I think my current professor of Computer Architecture (this course brought back the memory) might actually be proud if he heard what I did back then.

Pointers

I was always quite comfortable with pointers. I don't know, they're a mathematicky concept. I like such stuff. Or perhaps it was because I was young when I was introduced to the matter and wasn't spoiled with automatic memory management (which I quite like nowadays).
So I tried function pointers. C is cool enough to let you have pointers to functions! And that means higher order functions. But I didn't know enough math to appreciate the concept as I do now. Still - I thought it was extremely cool. So I wrote a function that halted execution and printed out "no such instruction". Why, you ask? Well, I made a 256-cell table (8-bit instructions) of pointers to functions. Now I didn't have to switch - just a look-up and an invocation. Great. Apart from the fact it didn't work.
The compiler said something along the lines of "You cannot make a table of pointers to functions!". I was puzzled. Skip 10 years into the future: today I was rethinking this and thought about casting the pointers. All the functions would be void -> void so I could cast back, no problem. A table of void pointers and casting. Yay!
Now 10 years back. I didn't think about casting the pointer. Type info was sacred to me!
So I "invented" function objects.

Objects

I swear to god I had not heard about function objects back then. It wasn't until this year, reading Bloch's Effective Java, where he talks about strategy objects, that I immediately recognized my idea. So now I had many classes, each implementing an execute method. And I had an array of these objects. Now I did a look-up and an invocation on an object. Sweet. And it even worked. But sadly I dropped the project and went on to graphics. Learnt SDL and did Tic-Tac-Toe. And dreamed about doing a vector 3D engine (curves, baby!). Which to this day I haven't tried to implement. Maybe I'll try in the near future.


Monday, October 8, 2012

Making a programming language: Part 7b - using objects

Table of contents | Whole project on github

Something like an EPIC FAIL occurred to me and I published a post containing only half the content I intended to write. So I'm doing a part b.

My intended usage of objects is something along the lines of
objectName.someProperty
objectName.someFunction()
someFunction().someProperty
someObject.someProperty.someFunction().someProperty.someFunction
Explanation

  1. getting a value from an object
  2. invoking a function contained in an object
  3. getting a value from the object returned by the invoked function
  4. a bit of a contrived example: invoking a function contained inside a property (an object) of an object, and then getting a function value from a property of the value returned by the first function. That's a mouthful, just read the damn code instead

Dot access

So everything is based on those little dots. At first my thoughts were something like "you just do expr <- expr | expr.expr". This is just wrong. At the very least I should have reversed the order, as this leads to infinite left recursion - then I might have gotten away with it. Then I realized I only need dots after function calls and simple identifiers. Design choice (if you think it's a bad one, leave a comment). Notice the "simple identifier". That's what I did: renamed identifier to simple identifier and put something that handles dots under the name identifier. And then fixed everything.
case class DotAccess(lst: List[Expression]) extends Expression

private def identifier: Parser[DotAccess] = 
  rep1sep((functionCall | simpleIdentifier), ".") ^^ DotAccess.apply
That's about it. At least for parsing. Now the fun begins.

Nesting scopes

Scopes were designed with nesting in mind. This is a double edged sword. See, "privates" can be done by relying on not being able to access the parent scope. If dot access exposed full addressing functionality, a powerful feature would cease to exist. So some protection should be in place. Something like a strict get:
class SScope ...
 def getStrict(key: String): Option[Any] = map.get(key)
...
And I also added an unlinked view of it, just to ease usage. This is just a method that returns a new SScope with no parent, overriding the getters and put to use the map available in the closure.
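Here's my hedged sketch of that idea - the field names, the mutable map, and the exact shape are assumptions; the real SScope in the repo differs:
// A scope delegates lookups to its own map and falls back to an optional parent;
// `unlinked` exposes the same map but without the parent, so dot access can't
// climb out of the object it is addressing.
class SScope(parent: Option[SScope] = None) {
  private val map = collection.mutable.Map.empty[String, Any]

  def put(key: String, value: Any): Unit = map(key) = value

  def get(key: String): Option[Any] =
    map.get(key) orElse (parent flatMap (_.get(key)))

  def getStrict(key: String): Option[Any] = map.get(key)

  def unlinked: SScope = new SScope(None) {
    override def get(key: String)       = SScope.this.map.get(key)
    override def getStrict(key: String) = SScope.this.map.get(key)
    override def put(key: String, value: Any): Unit = SScope.this.map(key) = value
  }
}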
So now I can walk down the list in DotAccess recursively and explicitly override the implicit scope parameter. And everything automagically works. Well, not quite. If you have a function call, the arguments need to be evaluated in the top scope, not in the nested one like the function identifier (in someObject.someFunction(arg), someFunction is looked up inside someObject, but arg must still be evaluated in the scope where the whole expression appears). At first I didn't even think about this, and only failing attempts at more complex recursion brought up this quite obvious bug.
So how to solve this? I could pre-evaluate all arguments, but I use recursion to do this and it happens (at least) two levels deeper than where the dots happen. So no go. I need to carry the outer scope along. I overloaded the apply method from Evaluator so other code can still function (tests ftw!) and all in all it looks like this:
def apply(e: List[Expression])(implicit scope: SScope): Any =
  (e map apply).lastOption match {
    case Some(a) => a
    case None => ()
  }

def apply(e: Expression)(implicit scope: SScope): Any = apply(e, None)(scope)

def apply(e: Expression, auxScope: Option[SScope])(implicit scope: SScope): Any = e match {
  ...
  case DotAccess(list) => {
    val outerScope = scope
    def step(list: List[Expression])(implicit scope: SScope): Any = list match {
      case Nil =>
        throw new ScratInvalidTokenError("got empty list in DotAccess")
      case elem :: Nil => apply(elem, Some(scope))(outerScope)
      case head :: tail => apply(head) match {
        case s: SScope => step(tail)(s.unlinked)
        case other =>
          throw new ScratInvalidTypeError("expected scope, got " + other)
      }
    }
    step(list)
  }
}
So an optional aux scope is the answer. It doesn't seem pretty to me, but it does the job. 

