Somewhere between the laborious, overcomplicated world of RUP and UML and the seat-of-the-pants just-hack-the-code approach, there has to lie a sweet spot in project management methodology. Issue Driven Project Management (IDPM) is one of the contenders for that middle ground, somewhere between the stifling bureaucracy of RUP and the kiddie-script amateurism of making it up as you go. It's a technique whereby you orient your project around solving issues, which can be defects or feature additions, each of which should take no more than one or two work days to complete. Each issue is created, then either assigned to an individual or taken up voluntarily by one, at which point it is considered accepted. From then on, its status can change to fixed, invalid, duplicate, and so on. In other words, each issue has its own life-cycle. The life-cycle stages of an issue can even be tied to the version control system by using appropriate tags in commit messages. The issue list is best viewed as a matrix whose rows are the project contributors and whose columns are the issues, and various project management tools can be massaged into providing this perspective, for example the Google Code hosting service interface.
My first experience with IDPM took place last week on a two-week-long project with three total members. Between the continuous integration process we were using, a Jenkins build pulling its code updates from the Google Code hosting site, and IDPM, we were able to communicate fairly effectively with each other. Just by glancing at the IDPM matrix, I could see whether any tasks were assigned to me or needed an assignee, what progress my teammates had made toward completing the project, and which issues they had currently accepted.
IDPM's break-up of a project into small tasks is a nice way to divide it into chunks so you can always see the end of the tunnel. It also avoids the tendency to optimize prematurely, make the architecture more general than it needs to be for the task at hand, or add luxury features before the basic application is even working.
In a sense, IDPM is like test driven development at a larger scale: first pick a task, then get it working, then pick another, and so on. It's all about incrementing functionality, not incrementing architecture or the size of the code base. This is a very pragmatic approach and leads to less stress than the attrition-based coding model, where you just code until there's nothing left to code. That model can cost productivity: the major architecture undergoes several changes and redos simply because the developer doesn't yet have any features working, and thus has no concrete idea of what the architecture needs to provide, just an inferred one based on a design that hasn't been translated to code.
In essence, IDPM is an organic way to grow software, in which every evolutionary change is driven by some need to satisfy a feature. It works well, and while it may not be the only good technique between the extremes of RUP and just-coding, it shows how well a moderate approach performs even in software development.
Monday, November 28, 2011
Tuesday, November 8, 2011
Learning the ropes with WattDepot
Here at UH, there's a URL at which you can see just about how much energy each floor of several of the university dorms is consuming. As part of a federally funded study, the university has outfitted the energy meters and submeters in several of the dorms with power, voltage, and energy consumption sensors that take readings at the sub-minute level and relay them back to a central server. By visiting a URL at that central server, you can pretty much see what energy consumption looks like for college dorm students.
The university's computer science department has designed and implemented a framework for energy data sharing, uniform access, storage, dissemination, analysis, and visualization, called WattDepot. The API is hosted on the Google Projects site, and once you download it, you can have command line interactions with the server working in just a few minutes. WattDepot provides a nice API designed to hide the lower layer protocols, making interacting with the server much like interacting with a file on your local machine.
First I implemented a class called SourceListing that just listed the sensors associated with a particular WattDepot server, which is known as the "owner" of those sensors. This was pretty simple and pretty much spelled out in the WattDepot documentation examples on the Google project page. It took me about 15 minutes to write the code and another 15 minutes to set up my Eclipse IDE to associate the framework source and javadocs with the library jar file, for easier editing, code completion, and javadoc perusal.
Then I wrote a SourceLatency class that sorted all the sources at the same URL by latency. This involved reusing the code I already had and writing an anonymous inner class implementing the Comparator interface to compare latencies. This took me about 20 minutes.
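The latency sort can be sketched with plain JDK classes. The Source class here is a made-up stand-in for the real WattDepot source type (the actual class and its accessors differ); the point is the anonymous inner Comparator.

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.Comparator;
import java.util.List;

// Sketch of sorting sources by latency with an anonymous inner class.
// "Source" is a hypothetical stand-in for the WattDepot source type.
public class SourceLatencySketch {
  static class Source {
    final String name;
    final long latencyMillis; // time since the source's latest sensor reading
    Source(String name, long latencyMillis) {
      this.name = name;
      this.latencyMillis = latencyMillis;
    }
  }

  /** Sorts sources in place, lowest latency first. */
  static void sortByLatency(List<Source> sources) {
    Collections.sort(sources, new Comparator<Source>() {
      @Override
      public int compare(Source a, Source b) {
        return Long.compare(a.latencyMillis, b.latencyMillis);
      }
    });
  }

  public static void main(String[] args) {
    List<Source> sources = new ArrayList<Source>();
    sources.add(new Source("dorm-A", 5000));
    sources.add(new Source("dorm-B", 1200));
    sources.add(new Source("dorm-C", 3300));
    sortByLatency(sources);
    for (Source s : sources) {
      System.out.println(s.name + " " + s.latencyMillis);
    }
  }
}
```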
After that, I wrote a SourceHierarchy class that uses the subsources attribute of each source, which spells out which sources are its "children", to construct a set of trees of sources, and then prints those trees recursively, using indentation to visualize the hierarchy much like files are shown in a file system browser or by the Unix "tree" command. This took me about a half hour, since it involved formulating a game plan to build the trees in the most economical way. What I did was simply find out what the roots were, by going through all the sources and eliminating every child as a root candidate. Then I printed the trees at those roots recursively, so there's never an explicit tree data structure in RAM, but I'm able to print out the tree structure nonetheless.
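The root-finding and recursive printing described above can be sketched as follows. Here a plain Map stands in for the subsources attribute of real WattDepot sources; the source names are invented for illustration.

```java
import java.util.ArrayList;
import java.util.HashSet;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.Set;

// Sketch of the SourceHierarchy idea: find the roots (sources that are
// nobody's child), then print the hierarchy with indentation, without ever
// building an explicit tree structure in memory.
public class SourceHierarchySketch {

  /** A root is any source that never appears in another source's subsources. */
  static List<String> findRoots(Map<String, List<String>> subsources) {
    Set<String> children = new HashSet<String>();
    for (List<String> subs : subsources.values()) {
      children.addAll(subs);
    }
    List<String> roots = new ArrayList<String>();
    for (String source : subsources.keySet()) {
      if (!children.contains(source)) {
        roots.add(source);
      }
    }
    return roots;
  }

  /** Recursively appends a source and its subsources, indented by depth. */
  static void printTree(String source, Map<String, List<String>> subsources,
                        int depth, StringBuilder out) {
    for (int i = 0; i < depth; i++) out.append("  ");
    out.append(source).append('\n');
    for (String child : subsources.getOrDefault(source, new ArrayList<String>())) {
      printTree(child, subsources, depth + 1, out);
    }
  }

  public static void main(String[] args) {
    Map<String, List<String>> subs = new LinkedHashMap<String, List<String>>();
    subs.put("campus", List.of("dorm-A", "dorm-B"));
    subs.put("dorm-A", List.of("floor-1", "floor-2"));
    subs.put("dorm-B", new ArrayList<String>());
    subs.put("floor-1", new ArrayList<String>());
    subs.put("floor-2", new ArrayList<String>());
    StringBuilder out = new StringBuilder();
    for (String root : findRoots(subs)) {
      printTree(root, subs, 0, out);
    }
    System.out.print(out);
  }
}
```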
After that, I was energized, so to say, and I proceeded to write a class called EnergyYesterday. Here started the pain. I had to find out how to have Java give me yesterday's date, so as not to hard-code it, and I had to find out how to translate between various date formats: XMLGregorianCalendar, which WattDepot uses, java.util.Date, and java.util.Calendar. Well, let's just say I found some code online that braves this tedious translation, I put it in a class, duly attributing it of course, and doing date calculations should be much easier from here going forward. This class took me maybe an hour and a half to write, much of it spent wrestling with dates and date formatting, cleaning up useful date conversion code I found on the web, and making utility classes out of it. This task would have been harder had it not been for a simple WattDepot API function called getEnergyConsumed that takes two timestamps and provides the total energy consumed between them, so no adding or aggregating was necessary on my part. I just had to loop through all the sources at the server and call it for each one.
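The date wrangling above can be sketched with just the JDK: compute the start of yesterday without hard-coding it, then convert a java.util date into the XMLGregorianCalendar type that WattDepot's API expects.

```java
import java.util.Calendar;
import java.util.Date;
import java.util.GregorianCalendar;
import javax.xml.datatype.DatatypeConfigurationException;
import javax.xml.datatype.DatatypeFactory;
import javax.xml.datatype.XMLGregorianCalendar;

// Sketch of the EnergyYesterday date plumbing.
public class YesterdaySketch {

  /** Returns a Calendar set to midnight at the start of yesterday. */
  static Calendar startOfYesterday() {
    Calendar cal = Calendar.getInstance();
    cal.add(Calendar.DAY_OF_MONTH, -1);
    cal.set(Calendar.HOUR_OF_DAY, 0);
    cal.set(Calendar.MINUTE, 0);
    cal.set(Calendar.SECOND, 0);
    cal.set(Calendar.MILLISECOND, 0);
    return cal;
  }

  /** Converts a java.util.Date to the XMLGregorianCalendar type WattDepot uses. */
  static XMLGregorianCalendar toXmlCalendar(Date date)
      throws DatatypeConfigurationException {
    GregorianCalendar gcal = new GregorianCalendar();
    gcal.setTime(date);
    return DatatypeFactory.newInstance().newXMLGregorianCalendar(gcal);
  }

  public static void main(String[] args) throws DatatypeConfigurationException {
    Calendar start = startOfYesterday();
    XMLGregorianCalendar stamp = toXmlCalendar(start.getTime());
    System.out.println("Yesterday started at: " + stamp);
  }
}
```

With two such timestamps in hand (start and end of yesterday), they can be passed straight to getEnergyConsumed.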
After that, I did some energy analysis in a class called HighestRecordedPowerYesterday, which uses an API call named getPowerConsumed. Here I had to loop through each sensor's data points between the starting and ending timestamps and keep track of the maximum power consumed and its timestamp. This took me about 45 minutes, and I borrowed much of the code from the WattDepot wiki documentation on the Google project site.
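The max-tracking loop is a simple running-maximum scan. The DataPoint class here is a hypothetical stand-in for whatever WattDepot returns for sensor data.

```java
// Sketch of the HighestRecordedPowerYesterday loop: walk a sensor's data
// points and remember the highest power reading and when it happened.
public class MaxPowerSketch {
  static class DataPoint {
    final long timestamp;     // epoch millis of the reading
    final double powerWatts;  // power consumed at that instant
    DataPoint(long timestamp, double powerWatts) {
      this.timestamp = timestamp;
      this.powerWatts = powerWatts;
    }
  }

  /** Returns the data point with the highest power, or null for an empty list. */
  static DataPoint maxPower(java.util.List<DataPoint> points) {
    DataPoint best = null;
    for (DataPoint p : points) {
      if (best == null || p.powerWatts > best.powerWatts) {
        best = p;
      }
    }
    return best;
  }

  public static void main(String[] args) {
    java.util.List<DataPoint> points = java.util.List.of(
        new DataPoint(1000, 120.5),
        new DataPoint(2000, 340.0),
        new DataPoint(3000, 95.2));
    DataPoint peak = maxPower(points);
    System.out.println("Peak " + peak.powerWatts + " W at t=" + peak.timestamp);
  }
}
```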
Finally, in a class called MondayAverageEnergy, I wrote code to average the energy consumed at each sensor over this past Monday and the one before it. Here I hard-coded the dates, and the code is a minor extension of EnergyYesterday. This took me about a half hour.
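The averaging step itself is a per-sensor mean over the two Mondays. The numbers below are made up; the real values would come from getEnergyConsumed for each Monday's time range.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch of MondayAverageEnergy's arithmetic: given each sensor's energy
// readings for the last two Mondays, report the per-sensor average.
public class MondayAverageSketch {

  static Map<String, Double> averageEnergy(Map<String, double[]> energyBySensor) {
    Map<String, Double> averages = new LinkedHashMap<String, Double>();
    for (Map.Entry<String, double[]> e : energyBySensor.entrySet()) {
      double sum = 0;
      for (double v : e.getValue()) {
        sum += v;
      }
      averages.put(e.getKey(), sum / e.getValue().length);
    }
    return averages;
  }

  public static void main(String[] args) {
    Map<String, double[]> readings = new LinkedHashMap<String, double[]>();
    readings.put("floor-1", new double[] {42.0, 38.0}); // kWh, two Mondays
    readings.put("floor-2", new double[] {51.5, 48.5});
    System.out.println(averageEnergy(readings));
  }
}
```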
Now that I've accomplished these tasks, I feel like I have a decent grip on the WattDepot client API and the workflow for getting data from WattDepot servers, but I have a feeling there's much more to the WattDepot framework, and I'll definitely be writing more about it in the next few weeks.
Tuesday, November 1, 2011
Energy in Hawaii
Hawaii presents a set of unique challenges and, on the other hand, fertile opportunities in the areas of green energy production and energy conservation. Because Hawaii is isolated, it is not part of a supergrid the way the contiguous US states are, so it has to import its energy from farther away and is more susceptible to temporary supply/demand imbalances. On the other hand, because of its unique location, year-round sunny climate, and many miles of shoreline, Hawaii is an ideal candidate for the early success of wind, sea, and sun energy solutions.
Some of these solutions aren't yet cost-effective on the mainland, due to the easy exploitation of cheap fossil fuels and the technological and infrastructure gap between fossil fuel production and green energy production. However, they are already cost-effective in Hawaii, despite the same technological gap, because Hawaii pays a surcharge on its electrical energy, due to transportation and isolation, that makes electricity two to three times more expensive than on the mainland. Unlike the mainland, Hawaii derives most of its electrical energy from oil, the most expensive fossil fuel.
Because of this unique economic situation, green energy is being pursued more aggressively in Hawaii than in most mainland states, even by Republican administrations such as that of former governor Linda Lingle, who signed the Hawaii Clean Energy Initiative. That goes against the grain of a Republican party that is largely skeptical of global warming, to the point that many Republicans view it as a liberal scheme to stifle capitalism.
However, green energy production and use does pose a few challenges. One of them is that green energy is typically intermittent: a solar panel doesn't produce energy at night, and a windmill doesn't produce energy when there is no wind. So, to make the best use of these intermittent and time-varying energy sources, we need to integrate them into our current energy grid in a cohesive and efficient manner.
This means that two-way communication *and* control need to take place between the energy consumer's home and the electric power plant, and between the devices in the consumer's home. This would allow, for example, a plant facing a demand spike to shut down the air conditioning systems of some of its clients' homes alternately, for a few minutes each, to smooth out the spike. It would also allow a home's air conditioning system to use more energy when that home's solar panels are producing more, by shifting the air conditioner's temperature activation set-point to align with the temporarily greater solar output.
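The set-point idea can be sketched in a few lines. Everything here is hypothetical: the thresholds, offsets, and function name are invented purely to illustrate how a device program might map solar output to an activation temperature.

```java
// Hypothetical sketch: lower the cooling set-point when solar production is
// high, so the air conditioner spends surplus energy; fall back to the
// default set-point when production is low. Numbers are illustrative only.
public class SetPointSketch {

  /** Returns the AC activation temperature (F) for a given PV output in watts. */
  static double coolingSetPoint(double solarOutputWatts) {
    double base = 78.0;                            // default set-point
    if (solarOutputWatts > 3000) return base - 3;  // plenty of solar: cool harder
    if (solarOutputWatts > 1000) return base - 1;  // moderate solar: cool a bit more
    return base;                                   // little or no solar: stay at default
  }

  public static void main(String[] args) {
    System.out.println(coolingSetPoint(4500)); // prints 75.0
    System.out.println(coolingSetPoint(200));  // prints 78.0
  }
}
```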
Of course, all this communication and control requires a decent amount of hardware and even more software, as we strive to make devices smarter and get new ideas for how to program and reprogram them. So, intelligent and robust software has a definite role to play in a green energy infrastructure. Beyond that, it may have an even bigger role in green energy research and calibration, since we need a way to visualize information in a decentralized manner, so that information from many places can be aggregated and viewed on one terminal. Even when deploying proven green energy solutions, an individual household will still want to adjust the parameters of the programs driving its energy-saving devices, to optimize its savings and target its specific needs, and this too requires software for input, validation, and communication.
So, it's nice to know that software is going to play an integral role in the evolution of one of Hawaii's most exciting technology and research areas, and that getting the software right will save us all time, energy, and money.