Sunday, February 22, 2009

Takeaways from chapters 1 and 2 of SICP

I worked through the classic CS book SICP a while ago, and today I made another pass over its first and second chapters. After this pass I asked myself what the most important lessons from these two chapters are, and this post is the result of answering that question.

Chapter-1:
When learning a new language, the things to look for are:
1. Primitive Expressions: the primitive elements of the language.
2. Means of Combination: the means the language provides to combine primitive elements into complex compound elements.
3. Means of Abstraction: the means the language provides to give an identity (a name) to the compound elements so built (a small Scheme sketch of these ideas follows the list).

I want to add one more item to this list, which comes from another great book, PoP:
4. Idioms: the conventional ways in which experienced programmers of that language write various common pieces of code.
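
To make the first three concrete, here is a tiny Scheme sketch (Scheme being SICP's language): the numbers and arithmetic operators are primitive expressions, the nested expression is a means of combination, and define is the means of abstraction that names the resulting compound procedure.

    (define (average x y)   ; means of abstraction: naming the compound procedure
      (/ (+ x y) 2))        ; means of combination: nesting combinations of primitives

    (average 3 7)           ; => 5   (3, 7, +, / are primitive expressions)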

Think recursively, but write recursive procedures that produce iterative processes by using state variables; CTM, to be precise, calls this "accumulator programming".
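
As a minimal sketch of what that means for factorial: the procedure below is still written recursively, but the state variable acc accumulates the partial product, so the process it generates is iterative (no deferred multiplications pile up).

    (define (factorial n)
      (define (iter count acc)          ; acc is the state variable (the accumulator)
        (if (> count n)
            acc
            (iter (+ count 1) (* acc count))))
      (iter 1 1))

    (factorial 5)                       ; => 120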

Chapter-2:
Abstraction! Abstraction!! Abstraction!!!
The engine (or work-horse, or algorithm) part of the program should work with abstract objects without worrying about how they are represented.
The user of a (data) abstraction should have absolutely no knowledge of its actual implementation; in other words, it should be completely independent of the representation. This is achieved by defining an "abstraction barrier", i.e. the interface of constructors and selectors, and it is what builds modularity into the system.
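
A sketch of this in the style of SICP's rational-number example (the gcd reduction is omitted to keep it small): add-rat, the "engine", talks only to the constructor make-rat and the selectors numer and denom, so it never needs to know that a rational happens to be a cons pair.

    ;; the abstraction barrier: constructor and selectors
    (define (make-rat n d) (cons n d))
    (define (numer r) (car r))
    (define (denom r) (cdr r))

    ;; the engine uses only the barrier, never car/cdr directly
    (define (add-rat a b)
      (make-rat (+ (* (numer a) (denom b))
                   (* (numer b) (denom a)))
                (* (denom a) (denom b))))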

Wishful Thinking!
A powerful strategy for coming up with abstractions. When building something, keep assuming that solutions to the subproblems are available and use these imaginary solutions; once the big problem is solved in terms of them, you can turn your focus to the subproblems. Thinking this way automatically brings the much-needed abstraction into the system.
Another place where it helps is in thinking recursively. For example, when you want the factorial of n, assume (wish) that someone kind has already given you the factorial of n-1, and then figure out what it takes to compute the factorial of n from that. This is a very powerful technique.
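
In code, the "wish" is simply the recursive call: inside the body we act as if (fact (- n 1)) were already available and only describe how to get from it to the factorial of n.

    (define (fact n)
      (if (= n 0)
          1
          (* n (fact (- n 1)))))   ; the wish: assume (fact (- n 1)) already works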

Closure!
It is even better if the "glue" we use to combine simple objects into compound objects can also combine the newly built compound objects. This property is called "closure", and when it holds it lets you create arbitrarily complex objects with minimal effort.
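
cons is the usual example: it glues together any two objects, including objects that were themselves built with cons, which is how pairs grow into lists and trees. A quick sketch:

    (define p (cons 1 2))            ; a pair of primitives
    (define q (cons p (cons 3 4)))   ; a pair of pairs: cons is closed under combination
    (car (car q))                    ; => 1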

Multiple representations of Abstract Data:
Sometimes having just one abstraction barrier is not enough. Say you have multiple representations of the data (for example, complex numbers can be kept in polar as well as rectangular form); to build a system that can use any of these representations simultaneously, we need to add another, higher abstraction barrier.
One technique for this is "type dispatching", where the generic selector checks the type of the datum and calls the appropriate operation for that representation. Its weakness is that adding another data representation requires modifying every generic selector. Two techniques that let you build such systems while still keeping them modular are "data-directed programming" and "message passing".
In data-directed programming, we keep an operation-type table; at run time we look up the appropriate procedure for a given operation and type in this table and apply it. Adding a new data representation only requires putting its entries into the table.
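
Here is a minimal sketch of the idea for the complex-number example; put and get are the table helpers SICP assumes as given, sketched here with a simple association list, and the two representations shown are toy versions of the rectangular and polar forms.

    ;; a toy operation-type table; SICP assumes put/get as primitives
    (define table '())
    (define (put op type proc)
      (set! table (cons (cons (list op type) proc) table)))
    (define (get op type)
      (let ((entry (assoc (list op type) table)))
        (and entry (cdr entry))))

    ;; tagged data, so the type can be recovered at run time
    (define (attach-tag tag datum) (cons tag datum))
    (define (type-tag datum) (car datum))
    (define (contents datum) (cdr datum))

    ;; each representation installs its own procedures in the table
    (put 'real-part 'rectangular car)
    (put 'real-part 'polar (lambda (z) (* (car z) (cos (cdr z)))))

    ;; the generic selector just looks up and applies
    (define (real-part z)
      ((get 'real-part (type-tag z)) (contents z)))

    (real-part (attach-tag 'rectangular (cons 3 4)))   ; => 3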
In the message-passing style, the data object itself is "intelligent": it is represented by a procedure, and the various components of the compound object are obtained by passing messages to it.
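
A sketch along the lines of SICP's make-from-real-imag: the constructor returns a procedure, and selecting a component means sending that procedure the corresponding message.

    (define (make-from-real-imag x y)
      (lambda (message)
        (cond ((eq? message 'real-part) x)
              ((eq? message 'imag-part) y)
              ((eq? message 'magnitude) (sqrt (+ (* x x) (* y y))))
              (else (error "unknown message" message)))))

    (define z (make-from-real-imag 3 4))
    (z 'real-part)    ; => 3
    (z 'magnitude)    ; => 5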