Wednesday, November 20, 2013

Exoplanet Light-Curves with OSCAAR - Photometry for Everyone!

For our astrophysics club we have been doing some observations on Faulkes North over on Maui and managed to capture half of a transit before the weather shut down the telescope. At that observation session we went through some tutorials on generating light-curves and discussed the general methodology (read about our Observation Observations), but I wanted to find a more robust, and ideally easier, way for members to analyze the data in the future. After flirting with the idea of writing a comprehensive program myself, I came to my senses and went looking for someone who already had. Lo and behold: OSCAAR.

OSCAAR is a simple-to-use GUI that assists in the development of light-curves for exoplanets (and other eclipsing events). In their words, "oscaar is an open source project aimed at helping you begin to analyze observations of transiting extrasolar planets with differential photometry," and, "oscaar is useful for observers at small and large observatories alike, and users with any level of coding experience. No coding experience or PhD in astrophysics necessary!" Or, like the title says, "Photometry for Everyone!"
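For anyone unfamiliar with the term, differential photometry just means dividing the target star's measured flux by that of one or more comparison stars so that clouds, airmass, and other shared systematics cancel out. A toy numpy sketch of the idea (my own illustration, not OSCAAR's actual code):

```python
import numpy as np

def differential_light_curve(target_flux, comparison_fluxes):
    """Divide the target's raw flux by the summed flux of the
    comparison stars, then normalize to the median of the ratio."""
    target_flux = np.asarray(target_flux, dtype=float)
    comparison = np.sum(np.asarray(comparison_fluxes, dtype=float), axis=0)
    ratio = target_flux / comparison
    return ratio / np.median(ratio)

# Fake fluxes: the target dims by 1% mid-series while the
# comparison stars stay constant (a cartoon transit).
target = np.array([100.0, 100.0, 99.0, 99.0, 100.0, 100.0])
comps = [np.full(6, 50.0), np.full(6, 80.0)]
curve = differential_light_curve(target, comps)
print(curve)  # dips to 0.99 in the middle
```

Real pipelines of course also subtract darks and divide flats before any of this, which is exactly the step OSCAAR initially insisted on (see below).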

However, since the software is still in development (2.0-beta at the time of writing) there are inevitably some problems, and I encountered several of them very quickly. Here are some initial notes on using the software and working around those problems, which I will continue to add to in the future.


  • Initially the program required dark frames and flats. I created an Issue regarding this limitation, and the authors immediately started working on a fix. Within a week or so the program was adjusted to create dummy dark and flat images (basically arrays of the same size as the image, consisting of 0s and 1s, respectively). The interface could probably use some work and I will be contacting the authors about that as well, but overall I was very pleased and happy with the quick response. Thanks!
  • The region file must be saved in the coordinate system format saoimage, or the program dies with an unhelpful error. This is neither obvious nor the DS9 default, so there was some confusion and digging on my part.
  • The exposure time is set in the header as 'DATE-OBS', which should (?) be selected automatically. I had to go into Extra Observatory Parameters and select 'DATE-OBS' (the only option given) for it to work.
  • If Tracking Plots and/or Photometry Plots are selected, the plots are drawn live as the program runs and seem to work well (see pretty pictures).
    Tracking and Photometry plots being generated while running.
      
  • When trying to do a fit, a bunch of array output is printed and then an error. Digging into it, it is a dimension error: something about the comparison-star array produced by np.dot versus what is passed into leastsq. It seems to be related to the issue seen here, with leastsq not knowing how to properly optimize a multi-dimensional array.
    • Well, after digging around for a while, it turns out that I just needed to use fewer comparison stars. I'm not sure whether this is a bug or by design and will contact the authors. The documentation does recommend using as few comparison stars as possible to increase speed, but it seems like the error should be handled a little more gracefully.


  • Half a light curve, with fit, for TrES-3b
    After processing, I see a graph displaying my light-curve (well, I only have half a curve). The dialog boxes for the dark and flat frames are still displayed, which is a little confusing; they should close earlier. Then I am taken to a box asking for my result file names, with a button for an MCMC Fit. I just accepted the default values for all the parameters, of which there are a lot.
  • This gives me an error finding analyticalTransitModel.so, which is a linked library, so it might just be a problem with me running from the repo directory rather than from an installed copy.




  • Trying again the next day, I used the previously installed copy of OSCAAR (instead of the version in the repo) and had no problems with analyticalTransitModel.so. I was able to download a copy of the exoplanets database (by entering the planet name and choosing Update Parameters). The fit ran for 100 iterations, looking fancy and science-like, but then gave me an AssertionError saying that the optimization was not successful and to try new Beta values. The error reporting could certainly be more user friendly: it suggests updating the Beta parameters, but I really have no idea how to adjust them to get more (or any) appropriate results.

MCMC fit and plot. This is in the middle of the process which eventually errors out.
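For the curious, the leastsq dimension error described above is easy to reproduce in isolation: scipy.optimize.leastsq expects the residual function to return a 1-D array, so a 2-D residual (say, one row per comparison star) has to be flattened before it is returned. A minimal sketch with made-up residual functions (not OSCAAR's code):

```python
import numpy as np
from scipy.optimize import leastsq

# Hypothetical residuals with one row per comparison star: 2-D,
# which is exactly the shape leastsq chokes on.
def residuals_2d(params, fluxes):
    model = params[0] * np.ones(fluxes.shape[1])
    return fluxes - model            # shape (n_stars, n_points)

# The fix is simply to flatten to 1-D before handing it back.
def residuals_1d(params, fluxes):
    return residuals_2d(params, fluxes).ravel()

# Fake normalized fluxes for 3 comparison stars, 20 points each.
fluxes = np.random.default_rng(0).normal(1.0, 0.01, size=(3, 20))

best, ier = leastsq(residuals_1d, x0=[0.5], args=(fluxes,))
print(best)  # recovers a value close to 1.0
```

Whether OSCAAR should do this flattening itself or simply report the shape mismatch cleanly is the question I'll be raising with the authors.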


All in all, I am impressed with the work done so far and look forward to working with the authors to help improve OSCAAR. There is some general clean-up that could be done with the GUI (I notice there is a branch for a new GUI), both in everyday use and in how errors are reported. I would also like to see a Python 3 version. The website lists the dependencies, and I think the only one not directly ported to Python 3 is PyFITS, which has since been rolled into astropy and should be working well, so perhaps the port to Python 3 could happen soon (yes, that is sort of me volunteering).

Sunday, October 13, 2013

International Observe the Moon Night

Stand about, look, let your eyes see the moon.
Had a good night out at the VIS checking out the moon. The University Astrophysics Club at UH-Hilo was up doing some LADEE Lunar Impact Monitoring for the Pacific International Space Center for Exploration Systems (PISCES). InOMN encourages everyone to take notice of our nearest celestial neighbor by simply going outside and looking up!

Sunday, July 21, 2013

2013 Lunar Science Virtual Forum Overview

You can't sequester science.

At least that's the idea coming out of the 2013 Lunar Science Forum from the NASA Lunar Science Institute (NLSI). The 6th annual forum saw a number of changes, from a (budget-enforced) switch to a completely virtual format all the way to the announcement that the Lunar Science Institute is now the Solar System Exploration Research Virtual Institute (SSERVI). One can extrapolate from the name change that the focus will no longer be only the Moon (think small bodies like asteroids) and that the forum will be virtual from here on out. My thought is that this is a good thing, as the NLSI really seems to be trying to bridge the work between scientists, universities, and governmental organizations. There were a few questions throughout the forum about cooperation with the private sector, although that discussion was, perhaps surprisingly, limited.

The SSERVI crew posing after a well run #LSF13
This being an all-virtual forum, there were naturally some challenges and issues. Overall, the folks at SSERVI (I believe most of them were at Ames) did a really great job with the technical side; most of the problems seemed to stem from participants who failed to pay attention to instructions. The agenda includes links to the entire archived conference, which can be played back in real time, including all the chat features and user blunders. It's kind of cool, actually. One definite advantage of an all-virtual conference is that a complete archive of every session is readily obtained. Indeed, you can even check me out in the Lightning Talks here. :)

One can easily garner the gist of the current research focus by looking over the agenda titles, with lunar geology dominating most of the sessions (naturally), although there was a fair sprinkling of volatiles, exospheres, human exploration, heliophysics, and other miscellaneous items scattered throughout. The individual speakers ranged from extremely fascinating to hard-to-understand-for-various-reasons to dry-and-pedantic, and the variation was largely independent of the subject matter.

The basic format was to have one or two general speakers give an introduction of sorts to the subject matter for the day. These were largely interesting and well done, giving brief overviews of the relevant issues and subject matter. For instance, the second day opened with some quick introductions to geology and volatiles as they are present on the Moon. Dana Hurley, of the Johns Hopkins Applied Physics Lab, began with the observation that when she started in the field 15 years ago the questions would have been about whether or not water existed on the Moon, a question we have now thoroughly answered. The questions now, she says, are: 1) what are the present-day abundances of volatiles, and 2) what is the actual composition of those volatiles? You can see her concise overview talk here; it seemed to reflect the general consensus that we have learned a whole lot recently and are about to learn a whole lot more.

Throughout the forum (I keep wanting to say conference) there was a lot of talk about LADEE, which is natural as it is about to launch (the window opens Sept. 6 - see launch details) and will provide a lot of science to most of the people attending. It's interesting to see all the different components of LADEE, and I can't seem to shake the feeling that everyone is simultaneously patting each other on the back and up-selling the LADEE mission in an effort to convince people it is worthwhile. Actually, this is the feeling I get from everything that comes out of NASA these days, which makes sense as the agency faces large budget cuts and a public that is skeptical of spending money on things it doesn't understand. Including the need for lunar forums.

On a personal end, and as mentioned, I did participate in the Lightning Talks, with Krystal giving a good overview of the hard work she has been doing and that I have been volunteering with. I also gave a quick summary of my plans to create an easy-to-use, black-box data-acquisition setup for lunar impact monitoring (details later). Overall, it was nice to get the (brief) exposure and a good chance to tout both PISCES and UH-Hilo.

SSERVI's concluding note highlights a few areas of interest, including the awards ceremonies, and links to the relevant areas. Below are a few talks that I thought were cool, in no particular order:

Sunday, July 7, 2013

Imiloa Talk - Observing the Center of the Milky Way at 45,000 feet

Having lived in Hilo for the last couple of years, and on the Big Island for over four years now, it is a bit saddening that I have never bothered to become a member of the Imiloa Astronomy Center nor really attend many of the awesome events they regularly put on. Today, however, I had the opportunity to change that by attending an interesting talk by Ryan Lau, a graduate student out of Cornell University and a native Hawaiian resident, discussing his graduate work with infrared astronomy aboard NASA's SOFIA, a modified 747 designed to operate at 45,000 feet.

One of SOFIA's recent flight plans
Lau did a really great job of giving a brief overview of his work and of infrared astronomy in general, opening with an anecdote about his realization that the Sagan Walk, which originates at Cornell University, has its final piece of installation (representing the nearest star to our sun, Alpha Centauri) right here at the Imiloa Center, a project completed just this last December. Reflecting on it afterward, it is interesting to me that his diagram of the Sagan Walk, with arrows stretching nearly 8000 km across the United States, ended up looking similar in spirit to his later slide depicting the flight paths taken by SOFIA as it crosses the country. The contrast, of course, is that while we have mastered flying ourselves easily across our own globe, the thought of traveling to our nearest star, let alone somewhere like the galactic center, remains as daunting a concept as the flat world was to early explorers.

Any talk given to the general public about a highly advanced and specialized subject necessarily skips over a lot of the details, but I thought Lau did a really good job in his presenting style, his ability to answer questions, and the overall format of the talk. As he highlighted at the beginning, he wanted listeners to come away from the hour with three key points:
  • Infrared is awesome.
  • Airborne astronomy is awesome.
  • The galactic center is awesome.
A look inside SOFIA. Lau is in bottom left.
With that in mind, Lau guided us through a brief but thorough overview of infrared astronomy, including the current ground- and space-based systems (R.I.P. Herschel) as well as the problems associated with each: atmospheric conditions mostly affect the former, and prohibitive costs the latter. So onward to the second key point, that of airborne astronomy and SOFIA.

There is literature out there (follow the links above) describing SOFIA so I won't go into too much detail other than to say that it appears to be a pretty cool setup. Flight times are typically on the order of 10 hours, with nearly 5 hours of science accomplished on each flight. A typical session will last about 3 weeks, with 2-3 flights happening each week, for a total of 10-15 flights per session. The latest session, which Lau was on the plane for, had ended the previous week and had included some exciting new images of the galactic center, which is the thrust of Lau's work and his third key point.

The Galactic Center is Awesome
Specifically, Lau has been using the Faint Object Infrared Camera for the SOFIA Telescope (FORCAST - playing fast and loose with acronyms) to look at two main regions of the galactic center: the inner 10 light-years, with the gaseous torus of dust that surrounds the black hole there, and The Sickle, a dense region of recent massive star formation and the site of three identified Luminous Blue Variables, extremely bright stars that can undergo unpredictable and radical shifts in brightness and spectra. In addition to being extremely bright, these stars are also extremely rare, with only about 20 ever confirmed. To have three of them in one area of focus is quite extraordinary, and the latest pictures generated by Lau also showed some pretty exciting properties that have never been observed before.

One of the really fascinating things for me when I see work like this is that the science is in progress. Lau had literally taken these images the week before, revealed some pretty incredible pictures of the two focus areas (images which haven't been released to the public yet), and is still in the active process of interpreting and analyzing the results. While I would love to spill the beans by talking about these cool properties, I'll wait until Lau has published his work before highlighting anything. Needless to say, I thought it was pretty cool.

As mentioned, Lau did a good job of explaining these concepts and some of his results and overall just did a really thorough and concise job of giving a run-down of his recent work. Thanks again to Ryan Lau and the Imiloa Astronomy Center for the opportunity.

Wednesday, May 1, 2013

Dr. Buzz Aldrin and the Aldrin Cycler

I attended a Space Studies seminar yesterday up on campus in which we watched a private video of a speech given by Dr. Buzz Aldrin to a group of PISCES members and supporters during the annual PISCES conference. This last year featured Dr. Aldrin speaking at the Waikoloa resort about a number of ideas, mostly focused on a united space policy across all sectors and countries, as well as a tour and discussion of the Aldrin Cycler, with some political asides on the part of Buzz being Buzz.

Artist depiction of an Aldrin Cycler. via
Throughout the video Buzz seemed lively and excited, often pausing to collect his thoughts and really appearing to speak with genuine interest about the subject. He remarked early on about how he was without a teleprompter and took the opportunity to make his first jab at the President. Most of his remarks were off the cuff and of good-natured humor, but he clearly has some gripes with the administration, past and present, about the lack of continued interest in the space program in general and things like the Apollo missions in particular.

The talk was split into two main areas, the first of which had Dr. Aldrin speaking in general terms about his ideas for space exploration going forward. As he stated, the idea is to "explore commercially developed and permanently settled space." He started right out by citing three core ideas of any space policy going forward: 1) strong leadership, 2) a sustainable path, and 3) long-term ambitions. Interestingly enough, I thought Dr. Aldrin's discussion of strong leadership and a united space policy, while interesting all around and containing nothing that could be considered negative, still seemed confused, almost pandering to both sides about what a space program should look like going forward. And what are those two sides Buzz is pandering to? On the one hand, he repeatedly used words like "united," "collaboration," "coming together," and "cooperation" while speaking of public, private, international, and, frankly, all sectors and peoples that will be interested in space in the coming century. And everything he calls for is sound and logical, as it is important that we be united in policy and ideas going forward.

Yet, on the other hand, most of his speech was essentially about how the USA can maintain "leadership" in the coming years, and about how we don't want to send astronauts to the Moon "only to find the Chinese already there." Overall, his general idea was that while other countries are focused on returning humans to the lunar surface, we need to start concentrating on how to get to Mars, thus staying one step ahead of the competition, as it were. Interestingly, Buzz, having had his chance to leap about the lunar surface, does not advocate directly sending humans to Mars right away, although that is definitely part of his long-term plan and a critical feature of the Aldrin Cycler. But, "if we're not sending people [to a planetary surface] then how can we be leaders? We do it by knowing more than anybody else."

Here Buzz seemed spot on in his assessment: namely, that we are going to fail, we are going to make mistakes, but we just need to start trying. We (here implying the USA) need to maintain our leadership through this willingness to try and fail, learning all the while.

The first half of the talk was filled with these kinds of practical orations. "Today, collaboration [international, public, government, state, private, etc.] is an essential element for our success." We must, "[take] into account all the other countries' space policy objectives." "I'm going to talk to you about a unified space exploration."

PISCES was also given its requisite amount of fanfare: "As the space faring community comes together to embark...PISCES will lead in a unique and critical role in this vital new enterprise in space." "Begin with PISCES and extend to cislunar space and beyond." "But even more important, PISCES is to lead the construction of the lunar base." And so on.

In the second half of the talk Buzz focused on the Aldrin, or Mars, Cycler and practical methods to actually start getting to Mars. Dr. Aldrin has his PhD in orbital mechanics, so this is not just some guy who has walked on the Moon speaking; he actually knows what he is talking about. Again, the central theme seemed to be that we just need to start doing it and stop being afraid. There was a fair amount of modestly technical information about the Cycler, which refers both to the synodic period and to the actual spacecraft Buzz envisions. [Details can be found on Buzz's site.]

There was actually a lot more to the talk and I have a good page of pithy quotes from Buzz. All in all, it was a very interesting talk, and it is always good to see an 83-year-old Buzz Aldrin still just as excited about humans getting into space.

A few items to share:

  • On unity: Buzz wants to create a United Strategic and Space Enterprise foundation to foster this internal and global unity. That's right, it does have the name USS Enterprise, and this is, obviously, on purpose.
  • On outreach and the public: "We really need to get philosophers, [and] people that are historians, to really think and look at what the earth is capable of doing. It is capable of taking humans beings and putting them on another surface in our solar system."
  • On our legacy: "Thousands of years from now we will look back on the leader that committed a group of people to [reach the surface of Mars]."
  • And, "That's why they're going to be the pilgrims."

Thursday, April 18, 2013

Business Breakfast with PISCES

The Pacific International Space Center for Exploration Systems is an international research and education center dedicated to sustaining life on the moon and beyond. Recent agreements have added new breath to the organization, fostering business developments in Hawaii.
This morning I had the chance to attend a breakfast conversation organized by Hui Ka Ua, with a special talk given by Rob Kelso, the new Executive Director of PISCES. Since I have been doing some work this semester with the LADEE project under the auspices of PISCES, this was a great opportunity to see some of the future plans and developments for PISCES, especially with regard to the Big Island. Indeed, since this was partly a talk organized by a business organization, much of the information that Kelso conveyed focused on the local business developments that will arise as a result of the PISCES work now being done. Highlighting all of this was a press release from PISCES, handed out beforehand, which describes the six new Memoranda of Understanding (MOUs) that have been signed with various governments and organizations throughout the world and that will funnel jobs and technology back to Hawaii. Rob Kelso, from the press release:
Many of the initiatives involve robotics, construction material research, renewable energy and telecommunications. The benefit will not only be technologies for use on planetary surfaces, but also innovative technologies that can have immediate application for the Hawaii economy and the general well-being of the State.
The MOUs span a variety of technologies but all gravitate around the blossoming aerospace industry in Hawaii, including 3D printing and other advanced fabrication techniques. Kelso didn't address any of these directly but spoke in more general terms about how Hawaii is very well situated for future international developments, pinpointing its central location in the Pacific, the unique environmental conditions on the Big Island (a lot of time was spent here), the fact that there are two international airports and two deep-sea ports, and a variety of power-generation mechanisms (think geothermal).

Interestingly, Kelso mentioned plans for such things as an aerospace technology park on the Big Island and a multitude of conferences and educational drives, and all around exhibited a lot of energy about the project. Opening his talk, Kelso also mentioned how he was learning the etymologies of 'Haole' and 'Aloha' and how, having arrived as a foreigner to the state, he wanted to be welcomed onto the island and to have PISCES be an organization that works with the peoples and future of Hawaii, rather than one that merely utilizes its resources. Very apt as an opening, as it is precisely a new and fresh breath that PISCES has been given with his arrival ("the first MOU's [sic] in five years for PISCES"). As someone trying to be involved with PISCES long-term, it is personally exciting to see some progress and change being made, and it was great to be able to attend this morning's session. Mahalo to Hui Ka Ua and PISCES for organizing the event.

Monday, November 12, 2012

Real-Time Graphing of Arduino Data (Python, matplotlib, and PyQt)

A simple LED controlled by a photocell. adafruit.com
This is just a quick example of how to read some serial data off of an Arduino and make a real-time plot of that data using Python (3.2), matplotlib, and PyQt. Nothing really fancy is going on here, and nothing that hasn't been done before (check the Resources), but this does combine a few disparate ideas and gets everything working with Python 3, so that's something. Our test device is a simple LED hooked up to a photocell so that as it gets darker the LED gets brighter. The raw analog input from the photocell is written out to the serial port. On the Python end, we simply read from the serial port and update a graph. All code is available on github.

Arduino

The Arduino setup is fairly simple and mostly follows the tutorial supplied by adafruit.com, listed under 'Simple Demonstration of Use'. Interfacing properly with the serial port, or, rather, getting Python to correctly read the data from the serial port, did require altering a few things in the Arduino code. Full source here.

The first thing to notice is our byte val = 0; at the top of the file. This is the variable we use to store the raw analog value from the Arduino, and I couldn't get the serial read to work without writing it into a byte directly (as opposed to an int). Other than that, there is not much different from the adafruit tutorial, except that I split the bulk of the code into a separate function, adjust_led.

void loop(void) {
  val = analogRead(photocellPin);
  Serial.println(val,DEC);
  adjust_led(val);
  delay(100);
}

void adjust_led(int photocellReading){
  // LED gets brighter the darker it is at the sensor
  // that means we have to -invert- the reading from 0-1023 back to 1023-0
  photocellReading = 1023 - int(photocellReading);

  if(photocellReading < 0){
    photocellReading = 0;
  }
  if(photocellReading > 250){
    // now we have to map 0-1023 to 0-255 since that's the range analogWrite uses
    // for our purposes we only map 250-600 to get a better light range.
    // feel free to experiment here.
    LEDbrightness = map(photocellReading, 250, 600, 0, 255);
  } else {
    LEDbrightness = 0;
  }

  analogWrite(LEDpin, LEDbrightness);
}

Pretty standard stuff. In the loop we read the analog pin and write the value out to the serial port with a newline, in DEC format. That is all that is needed in order to plot from Python; the rest of the Arduino code deals with adjusting the LED. Two things to note: if photocellReading is less than zero we bump it to zero, and if it is less than 250 we don't turn on the light at all. So, ideally, the LED stays off until it is "sufficiently" dark, then turns on and proceeds to get brighter as it gets darker. The code is not perfect: for some reason I see some flickering and spiking, and the overall transition isn't as smooth as I would like in terms of the visible effect of the LED. However, since our purpose is to get a graph going, and because looking at the graph might help us debug what is going on with the Arduino, we ignore all that for now and move on.
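Since the flickering is hard to reason about on the device itself, it can help to desk-check the brightness logic off-board. Here is a rough Python port of adjust_led (my own translation, with map_range standing in for Arduino's map(); note that, like the original, it does not clamp results above 255):

```python
def map_range(x, in_min, in_max, out_min, out_max):
    """Integer re-mapping, mirroring Arduino's map()."""
    return (x - in_min) * (out_max - out_min) // (in_max - in_min) + out_min

def led_brightness(photocell_reading):
    # Invert 0-1023 so darker readings give brighter LED values,
    # then only light the LED once the inverted reading passes 250.
    inverted = max(0, 1023 - photocell_reading)
    if inverted > 250:
        return map_range(inverted, 250, 600, 0, 255)
    return 0

print(led_brightness(900))  # bright room: LED off (0)
print(led_brightness(500))  # dimmer room: LED partially on (198)
```

Sweeping led_brightness over 0-1023 like this makes it easy to spot, for example, that readings darker than the 600 end of the map range produce values above 255, which analogWrite will truncate.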

Python

There are two Python files to deal with. The first, SerialData.py, is fairly generic and should work with any serial data from an Arduino. The second, light_sensor_plot.py, creates our PyQt GUI, handles our data, and draws the graph.

SerialData.py

The code is lifted almost directly from this tutorial (on his github this file is called Arduino_Monitor.py). A few differences:

        buffer = buffer + ser.read(ser.inWaiting()).decode()

I had to add .decode() while reading the buffer to get anything to work. I think this is a result of the difference in how python3 handles byte values versus python2.
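To see why the .decode() matters, you don't need the Arduino at all: under Python 3, ser.read() returns bytes, and concatenating bytes onto a str buffer raises a TypeError. A quick stand-alone illustration (with a hard-coded chunk in place of the real serial read):

```python
# Simulate what pyserial's read() hands back under Python 3: bytes, not str.
chunk = b"512\n487\n"

buffer = ""
buffer += chunk.decode()   # without .decode() this line raises TypeError

# Split on newlines, same as the monitor loop, dropping the empty tail.
readings = [int(v) for v in buffer.split("\n") if v]
print(readings)  # [512, 487]
```

Under Python 2 the same concatenation silently works because read() returns a plain str, which is why the original tutorial never needed the call.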

        if not self.ser:
            return 0

I return a zero (0) instead of 100 if the serial port cannot be read; ideally we shouldn't even get here. I also removed the line that prints "bogus" on the ValueError exception. See my full code.

light_sensor_plot.py

Most of the basics of this file are lifted from Chapter 6 of Sandro Tosi's Matplotlib for Python Developers. In that chapter you will find a section called 'Real-time update of a Matplotlib graph', which does what it says on the tin. In his example Tosi graphs some CPU values from psutil, and all I really do is strip that out, get our data from SerialData.py instead, and graph away. A lot of Tosi's code deals with processing the CPU values, whereas our data needs no additional processing. Perhaps the biggest difference is that Tosi reads values once per second for a maximum of 30 seconds and sets up his graph accordingly. Our graph wants updates more often than once per second, and it also needs to scroll as new data arrives. The update interval is easy enough:

self.timer = self.startTimer(100)

This will give us an update every 100 ms, which should be smooth enough. The scrolling graph turns out to be just as easy:

        # force a redraw of the Figure - we start with an initial
        # horizontal axes but 'scroll' as time goes by
        if(self.cnt >= self.window_size):
            self.ax.set_xlim(self.cnt - self.window_size, self.cnt + 15)

So all we do here is check whether our iteration count has reached window_size (30) and, if so, set the x limits to a 'window' that follows the data, with window_size space behind the current point and 15 spaces ahead.
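Putting the timer-driven updates and the sliding x-limits together, here is a stripped-down, headless sketch of the scrolling behavior (a plain loop standing in for the Qt timer events, and fake data in place of the serial readings):

```python
import matplotlib
matplotlib.use("Agg")  # headless backend: no window needed
import matplotlib.pyplot as plt

window_size = 30
fig, ax = plt.subplots()
line, = ax.plot([], [])

xs, ys = [], []
for cnt, value in enumerate(range(100)):  # stand-in for 100 timer ticks
    xs.append(cnt)
    ys.append(value)
    line.set_data(xs, ys)
    # Scroll once we have more points than fit in the initial window:
    # keep window_size behind the newest point and 15 ahead of it.
    if cnt >= window_size:
        ax.set_xlim(cnt - window_size, cnt + 15)

print(ax.get_xlim())  # (69.0, 114.0) after the final tick
```

In the real application each loop iteration corresponds to one timerEvent firing every 100 ms, with the value coming from SerialData.py instead of range().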

Other than that, there is not actually much to the example. Ideally I would like to split it up so that we have one file for reading from the Arduino (currently SerialData.py), one for generating the GUI, and one for dealing with the context-specific data. You can see that our GUI is really simple; Tosi and Eli go on to add fancy navigation bars and such, so you can build from here.
Real-time graph of serial data values read from arduino.

Resources