With a hot and sunny August, Texas set a new record in 2015 for peak electricity demand at 69,621 MW. In the units on your home electricity bill, sustaining that demand for one hour consumes 69,621,000 kWh. These figures are for ERCOT, which manages reliability and transmission for the majority of Texas's electricity demand. ERCOT's job of managing the grid is becoming increasingly complex with the rise of renewable power.
In 2000 wind generation in Texas was negligible, but by 2014 installed capacity had grown to 14,100 MW. That wind power generated nearly 10.6% of the state's electricity, making Texas the top state for wind power. Wind capacity is set to grow to nearly 20,000 MW, overtaking nuclear to become Texas's third largest source of electricity. This growth has been a trend for several years and is the result of billions invested in transmission lines.
One of my largest personal projects was the creation of a weather prediction database. I wanted to store 7 key metrics that could later power Android and iPhone fitness apps. The full program can be found on my GitHub here. I chose the National Weather Service's NDFD (National Digital Forecast Database) as my source since it offers a free SOAP web service.
In the United States there are over 43,000 ZIP codes, and the National Weather Service maintains weather prediction data for each of them. The broad goal of this project was to gather that data and load it into a local MySQL database, where it would be easier to access.
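The storage side of a project like this can be sketched briefly. The example below uses SQLite in place of MySQL so it runs anywhere, and the seven metric columns are illustrative guesses (temperatures, humidity, wind, and so on), not the project's actual schema:

```python
import sqlite3

# Stand-in for the MySQL database; one row per ZIP code per forecast time.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE forecast (
        zip_code      TEXT,
        forecast_time TEXT,
        max_temp      REAL,  -- the 7 metric columns below are hypothetical
        min_temp      REAL,
        humidity      REAL,
        wind_speed    REAL,
        cloud_cover   REAL,
        precip_chance REAL,
        dew_point     REAL
    )
""")

def store_forecast(row):
    """Insert one forecast row (zip, time, plus 7 metrics)."""
    conn.execute("INSERT INTO forecast VALUES (?,?,?,?,?,?,?,?,?)", row)

# One example row for an Austin ZIP code.
store_forecast(("78701", "2015-08-01T12:00", 101.0, 78.0, 40.0, 8.0, 10.0, 5.0, 65.0))
count = conn.execute("SELECT COUNT(*) FROM forecast").fetchone()[0]
print(count)  # prints 1
```

A real run would loop over all 43,000+ ZIP codes, calling the NDFD SOAP service for each and inserting the parsed response the same way.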
If you have travelled downtown or to a central neighborhood in your city recently, you may have been surprised by how much construction you saw. Cities like Austin, Portland, and Denver have been growing at an accelerated clip since the Great Recession ended six years ago. The suburbs are still growing, and sprawling more than ever across the nation, but what's different from the '70s through the '90s is that walkable neighborhoods in cities' central cores are also growing.
Jeff Speck is an urban designer, and most people probably don't know what an urban designer does. That's not surprising, because it is a very small field that doesn't get much attention.
In the previous blog post I covered how the front end displays a chart of the % CPU usage of the server running this website. Now I'll continue with the hidden side of things: the back end. The back end's purpose is to generate the text file containing the time and % CPU. There are 4 main pieces that make this work, starting with the Linux command mpstat.
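To illustrate the mpstat piece: mpstat's summary line reports %idle in its final column, so the overall busy percentage is simply 100 minus that value. Here is a minimal, hypothetical sketch of that parsing step (not the actual project code); the sample line mimics `mpstat 1 1` output, though column layout can vary by sysstat version:

```python
def parse_cpu_usage(mpstat_line: str) -> float:
    """Return percent CPU busy given one mpstat summary line.

    mpstat reports %idle as the final column, so busy = 100 - idle.
    """
    fields = mpstat_line.split()
    idle = float(fields[-1])  # %idle is the last field
    return round(100.0 - idle, 2)

# Sample "Average" line like one produced by `mpstat 1 1`:
sample = "Average:  all  1.50  0.00  0.75  0.10  0.00  0.05  0.00  0.00  0.00  97.60"
print(parse_cpu_usage(sample))  # prints 2.4
```

On the server, a value like this would be appended alongside a timestamp to the text file the front end reads.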
My website's hosting server is a virtual private server deployed on DigitalOcean's cloud. DigitalOcean lets its users view their server's CPU usage on a graph. When I first booted up a DigitalOcean server, I tried a few things to get this chart to show some activity. My servers run the Linux distribution Ubuntu, but Ubuntu by itself idles at a baseline CPU usage close to 0%. After I uploaded a few old Java programs that had to do some decent calculations, I was finally able to move the CPU usage around a little.
More recently I decided to try to recreate a simpler version of this chart on my website.