CS 677 Distributed Operating Systems

Spring 2011

Programming Assignment 2: Asterix and the Trading Post

Due: Friday, April 1, 2011, 5pm



  • A. The problem


  • B. Evaluation and Measurement

    1. Deploy at least 6 peers. They can be set up on the same machine (in different directories) or on different machines.
    2. Do a simple experimental study to evaluate the behavior of your system. Compute the average response time per client search request by measuring the response time seen by a client for, say, 1000 sequential requests. Also observe how performance changes during a coordinator re-election. Make the necessary plots to support your conclusions.
    3. Next, run some of your peers on remote EC2 servers and the rest on local machine(s). Force your leader election to choose a peer on EC2 as the coordinator (this can be done, for instance, by giving the peers running on EC2 higher IDs, which will cause the leader election to choose an EC2 peer as the coordinator). Repeat the experiment and comment on how your measurements change due to WAN latencies between some of the peers. Instructions on accessing EC2 servers to run your experiments are here
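The per-request timing in step 2 can be sketched as follows; this is only one possible harness, and the `search` callable and query string are placeholders for whatever blocking search call your own client exposes:

```python
import time

def measure_avg_response_time(search, query, num_requests=1000):
    """Issue num_requests sequential requests through the client's
    blocking search call and return the mean response time in seconds."""
    elapsed = 0.0
    for _ in range(num_requests):
        start = time.perf_counter()
        search(query)  # placeholder for your client's search request
        elapsed += time.perf_counter() - start
    return elapsed / num_requests
```

To make the effect of a coordinator re-election visible, also log each individual request's latency with a timestamp rather than only the average; the re-election should then show up as a spike in your plot.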


  • C. What you will submit

  • When you have finished implementing the complete assignment as described above, submit your solution as a zip file uploaded to SPARK.
  • Each program must work correctly and be documented. The zip file you upload to SPARK should contain:
    1. A copy of the output generated by running your program. When it receives a product, your program should print a message such as "bought product_name from peerID". When a peer issues a query, your program should print the returned results in a nicely formatted manner, including the local time at which the result is received. When a new coordinator is elected, print a message like "Dear buyers and sellers, my ID is ..., and I am the new coordinator".
    2. A separate document/file of approximately two pages describing the overall program design, how it works, and the design tradeoffs considered and made. Also describe possible improvements and extensions to your program (and sketch how they might be made).
    3. A program listing containing in-line documentation.
    4. A separate description of the tests you ran on your program to convince yourself that it is indeed correct. Also describe any cases in which your program is known not to work correctly.
    5. Performance results.
    6. A readme file about how to run your code.

  • D. Grading policy for all programming assignments

    1. Program Listing
        works correctly --------------- 50%
        in-line documentation --------- 15%
    2. Design Document
        quality of design ------------- 15%
        understandability of doc ------ 10%
    3. Thoroughness of test cases ----- 10%
    4. Grades for late programs will be lowered 12 points per day late.