- Improve our original implementation based on the feedback in the code review.
- Install and use Hackystat and SoftwareICU to get a visualization of our group process.
- Implement new commands.
- Answer a few WattDepot questions (task B) from Philip.
The second step was to use Hackystat and SoftwareICU to get a visualization of our group process. The setup was fairly tricky, but we eventually got it working. I updated the Hackystat library and forgot to tell Yichi, but we resolved that fairly quickly.
The third step was to implement the new commands. This was pretty straightforward: I implemented one command while Yichi implemented two. I changed a few things in Yichi's code (I had helped a classmate with one of the commands Yichi was working on), but it was good for the most part.
Before I get to the questions posed by Philip, I'd say overall I was pretty pleased with how the project went. Our design is pretty solid and our test coverage is 91% (a mere 1-2% increase over version 1.0). There are some minor issues reported by the reviewers that we didn't quite get to. The help refactoring meant we could provide better error messages, but as of this writing we haven't redone all of the commands.
Yichi and I did not meet regularly; we mainly communicated via email, which I think worked for us. I might have spent more time refactoring, but I had a design in mind that I wanted to implement. However, Yichi implemented two commands versus my one, so I guess the work was equally partitioned even if the time spent wasn't.
I'm not sure how I feel about the SoftwareICU. On the one hand, it's neat to visualize our project health. However, this revision was not a good time to get introduced to the SoftwareICU mostly because some aspects felt out of our control. For instance, many people had to refactor and/or rewrite parts of their code based on reviews. Thus, the churn reading on the SoftwareICU will probably be high for most of us. As for complexity, many students may have started out with a poor implementation and will be stuck with a high complexity reading unless they rewrite most of their code (although, I guess the implementation wasn't too complex yet). It'll be interesting to use the SoftwareICU on our next assignment, since we'll be starting with a clean slate.
Robert had mentioned to us that the energy consumed by SIM_OAHU_GRID is zero for all dates, since the data only represents power plants. Instead, energystats calculates the energy generated by the grid. Retrieving the hourly data takes a while on my internet connection, probably because there's a lot of data to gather, so I split the month up into 10-day increments to hopefully make it a little more reliable.
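The 10-day splitting is simple enough that it's worth sketching. Here's a minimal illustration (in Python, just for clarity; our client is not written this way) of how a date range can be chunked into 10-day increments, each of which becomes one energystats invocation. The helper name and the command-string formatting are my own, hypothetical choices:

```python
from datetime import date, timedelta

def ten_day_chunks(start, end, size=10):
    """Yield (chunk_start, chunk_end) date pairs covering [start, end]
    in increments of `size` days (the last chunk may be shorter)."""
    cur = start
    while cur <= end:
        chunk_end = min(cur + timedelta(days=size - 1), end)
        yield cur, chunk_end
        cur = chunk_end + timedelta(days=1)

# One energystats query per 10-day chunk of November 2009.
commands = [
    f"energystats generated SIM_OAHU_GRID from {s} to {e} hourly"
    for s, e in ten_day_chunks(date(2009, 11, 1), date(2009, 11, 30))
]
```

For November this yields exactly the three ranges used below (11-01 to 11-10, 11-11 to 11-20, 11-21 to 11-30).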
>energystats generated SIM_OAHU_GRID from 2009-11-01 to 2009-11-10 hourly
Max: 951.0 MWh at 2009-11-02T20:00:00.000-10:00
Min: 497.5 MWh at 2009-11-02T04:00:00.000-10:00
Average: 606.3 MWh
>energystats generated SIM_OAHU_GRID from 2009-11-11 to 2009-11-20 hourly
Max: 951.0 MWh at 2009-11-11T20:00:00.000-10:00
Min: 497.5 MWh at 2009-11-11T04:00:00.000-10:00
Average: 609.4 MWh
>energystats generated SIM_OAHU_GRID from 2009-11-21 to 2009-11-30 hourly
Max: 951.0 MWh at 2009-11-23T20:00:00.000-10:00
Min: 497.5 MWh at 2009-11-23T04:00:00.000-10:00
Average: 603.1 MWh
So there seems to be at least a three-way tie for the max and min, although they might recur even more regularly than that. Let's look at the daily data.
>energystats generated SIM_OAHU_GRID from 2009-11-01 to 2009-11-30 daily
Max: 14764.0 MWh at 2009-11-03T00:00:00.000-10:00
Min: 14089.0 MWh at 2009-11-02T00:00:00.000-10:00
Average: 14571.1 MWh
If only you could see me twiddle my thumbs while these commands do their thing. Finally, here's the carbon emitted statistics.
>carbonstats SIM_OAHU_GRID from 2009-11-01 to 2009-11-30
Max: 29959472.0 lbs at 2009-11-04T00:00:00.000-10:00
Min: 22908808.0 lbs at 2009-11-07T00:00:00.000-10:00
Average: 27141379.3 lbs
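Both energystats and carbonstats boil down to the same summary over a series of timestamped readings: the maximum and minimum with their timestamps, plus the average. A minimal sketch of that summary (again in Python for illustration; the function name and tuple layout are my own, not our client's actual API):

```python
def summarize(samples):
    """Given a list of (timestamp, value) samples, return
    (max_value, max_ts, min_value, min_ts, average) --
    the figures that energystats/carbonstats report."""
    max_ts, max_val = max(samples, key=lambda pair: pair[1])
    min_ts, min_val = min(samples, key=lambda pair: pair[1])
    average = sum(v for _, v in samples) / len(samples)
    return max_val, max_ts, min_val, min_ts, average

# Hypothetical three-sample day, just to show the shape of the result:
stats = summarize([("04:00", 497.5), ("12:00", 600.0), ("20:00", 951.0)])
```

Ties, as seen in the hourly output above, are resolved arbitrarily here (Python's max/min return the first of several equal values); I don't know offhand which occurrence our client reports.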
And that wraps it up for the command line client. We're implementing a web app in the coming weeks, so stay tuned!