Tuesday, November 28, 2017

RaspPi + SenseHat + Headless + What's my IP?

Camera box setup for PAN001, showing the Arduino + hub + DHT22 setup. Also a tangle of wires.
I'm playing with a little Raspberry Pi 3 + SenseHat. The ultimate goal is to replace the electronics in the camera box (see pic) with this simpler setup. While testing, however, I want the Pi plugged into our wired network, which unfortunately doesn't give it a static IP address. Since I want to measure power levels and generally don't want to deal with having an HDMI display around, I wanted a quick way to get the IP should it change.

The quick solution consists of two components: a bash script that gets the IP address, and a Python script that displays it on the 8x8 LED matrix provided by the SenseHat.

Code samples are listed below. Sorry about the lack of syntax highlighting and formatting. Blogger doesn't make that super easy for me and I don't have time to figure it out.

Video of it in action!




Added to $HOME/.profile:
/home/pi/get-ip.sh &
---

get-ip.sh:


#!/bin/bash
# Grab the IPv4 address from eth0. The 'cut' field count relies on the exact
# spacing of Raspbian's ifconfig output, so it is admittedly fragile.
IP=$(ifconfig eth0 | grep 'inet ' | cut -d ' ' -f 10)
/home/pi/show-ip.py $IP
---
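As an aside: the cut-based parsing above depends on the exact spacing of ifconfig's output, so if it ever breaks, something like the following Python sketch could grab the address instead. This isn't part of my actual setup; the 8.8.8.8/port 80 destination is arbitrary, and no packets are actually sent because connect() on a UDP socket only records the default peer.

#!/usr/bin/env python3
# Rough alternative to get-ip.sh: ask the kernel which local address would
# be used to reach an outside host. Nothing is transmitted.
import socket

def get_ip(probe_host='8.8.8.8', probe_port=80):
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.connect((probe_host, probe_port))
        return s.getsockname()[0]

if __name__ == '__main__':
    print(get_ip())
---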

show-ip.py:

#!/usr/bin/env python3

import os
import sys
from sense_hat import SenseHat

show_ip = True

# Simple lock file so the message loop doesn't get started twice.
fn = '/home/pi/show-ip'
if os.path.exists(fn):
    sys.exit()

with open(fn, 'w') as f:
    f.write('')

try:
    ip = sys.argv[1]
except IndexError:
    print("Must pass an IP")
    os.unlink(fn)  # clean up the lock file so the next run isn't blocked
    sys.exit(1)
else:
    sense = SenseHat()

    # Scroll the IP across the LED matrix until the joystick is nudged.
    while show_ip:
        events = sense.stick.get_events()
        if len(events) > 0:
            break
        sense.show_message(ip)

    sense.clear()
    os.unlink(fn)

Saturday, November 18, 2017

Planetary Frontiers Workshop - Desert Fireball Network

I had the chance to attend most of the 2017 Planetary Frontiers Workshop here at Macquarie University. The aim of the workshop is to bring together folks from across Australia who are working in planetary science. Traditionally that has meant mostly solar system folks, although it is being expanded out to exoplanets. Specifically, the focus for the coming years will probably turn to the formation of terrestrial planets and to meteors/comets in general.

As such, the workshop had a lot of geologists. Lots of talk about chondrites and other such things I know nothing about (although there were cool titles, such as "Giant convecting mudballs of the early solar system"). Thankfully it was split up so that most of the hardcore geology was on Thursday while the astronomy side of things mostly took place on Friday.

One thing of particular interest on Thursday night was the folks from the Desert Fireball Network. I've heard about their work before and have been interested since they have units spread out all over Australia, in the same fashion PANOPTES hopes to achieve. Check out their map, which is pretty impressive. Talking to them, they said most installations are simply on farm land and that the farmers have been very supportive of hosting the units.

Also interesting for PANOPTES is that most of these units run on solar installations and feature built-in Telstra wireless connections, for which the organization has a monolithic (i.e. 40+ SIM cards) sort of "family plan". Their model is that the computed coordinates of each fireball are delivered wirelessly on a daily basis, which is minimal data, while the raw data is stored on 3 separate 10 TB hard drives that are offloaded manually whenever the unit undergoes maintenance (every 6 months to 1 year, depending on ease of access).

Overview of the DFNEXT box. Nice and neat.
The design of their box is also pretty impressive and incorporates a few design elements that PANOPTES could benefit from. In particular, they have a tiny 5V fan hooked up to a 3D-printed plastic piece that directs air over the front of the lens in order to deal with condensation issues. They've also got a custom-printed PCB that they use for power relays, which seems well thought out. A few details and pics here, although unfortunately nothing highly technical.

Their box is also cut from aluminium that they send off to a private company to be machined, which only costs ~$15. They provide the CAD files and say it is much more efficient than having users machine their own boxes.

All in all, the DFN is a slightly different model from PANOPTES. They build the units in-house (4 or 5 PhD students, a full-time staff member, etc.) and then send the product off to the users, who host and maintain the box. Still, it could be a valuable partnership from which we could learn a lot.

The rest of the workshop was also fruitful and good for learning about the community here in Australia.








Friday, November 10, 2017

PANOPTES - Diving in

James has done a lot of work to get PAN006 up and running at Wheaton College in Norton, MA and this morning we attempted to do a polar alignment test so that we can drive the unit toward first light. Things went...sort of.

First, big thanks to Sean and Joe at Wheaton for helping out with the procedure. We had James remotely operating the unit via ssh while looking at a webcam, Sean on the inside communicating with James and me on a Hangout, and Joe doing the dirty work (read: -1.4 °C cold work) in the dome itself. The relay switches aren't quite hooked up yet, so Joe was mostly acting as a control in case things went horribly wrong. In the end, they went wrong, just not horribly.

There were clearly some issues with the hacky web interface I created for doing a remote polar alignment, so we skipped that and had James drive manually. First we tested some basic Home->Park->Home commands to make sure the mount goes more or less where it should. Despite one large, bad-sounding noise and some friction on the RA axis, not to mention some self-strangulation by the Dec cable, we were able to test the basic mount movement commands. Having accomplished that, we finally got the alignment procedure going, only to be stymied by an inability to plate-solve the images. It was getting late, so that is where we left things.

So now I am in the lab for the first time in weeks, remembering what a disarray I had left things in. 

"It is better to take what does not belong to you than to let it lie around neglected." Thankfully Mark Twain hasn't visited my lab.

Two simple goals for today: 1) try to reduce the startup time for the POCS shell, mostly by limiting the time the cameras take to do initialization, and 2) try to fix the solving.

I already passed some info off to James to play with the solving on his NUC so I'm checking that box for the day. Now to clean up this lab bench and get the cameras up and running quicker...

EDITED: 1) Turns out gphoto2 can take all the options at once, so we were able to drastically speed up init via this PR.
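For the curious, the speed-up is just a matter of batching every --set-config into a single gphoto2 invocation instead of shelling out once per property. A rough sketch of the idea (the config names and values below are illustrative, not necessarily what POCS actually sets):

import subprocess

# Illustrative only: set several camera properties with one gphoto2 call.
settings = {
    'capturetarget': '0',
    'iso': '100',
    'imageformat': '9',
}

cmd = ['gphoto2']
for name, value in settings.items():
    cmd.extend(['--set-config', '{}={}'.format(name, value)])

subprocess.run(cmd, check=True)  # one process launch instead of several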

Wednesday, November 20, 2013

Exoplanet Light-Curves with OSCAAR - Photometry for Everyone!

For our astrophysics club we have been doing some observations on Faulkes North over on Maui and managed to capture half of a transit before the weather shut down the telescope. At that observation session we went through some tutorials on generating light curves and discussed the general methodology (read about our Observation Observations), but I wanted to find a more robust, and ideally easier, way for members to analyze the data in the future. After flirting with the idea of writing a comprehensive program myself, I came to my senses and went looking for someone who already had. Lo and behold, OSCAAR.

OSCAAR is a simple-to-use GUI that assists in the development of light-curves for exoplanets (and other eclipsing events). In their words, "oscaar is an open source project aimed at helping you begin to analyze observations of transiting extrasolar planets with differential photometry," and, "oscaar is useful for observers at small and large observatories alike, and users with any level of coding experience. No coding experience or PhD in astrophysics necessary!" Or, like the title says, "Photometry for Everyone!"

However, since the software is still in development (2.0-beta at the time of writing) there are inevitably some problems, and I very quickly encountered a few. Here are some initial notes on using the software and working around those problems; I'll keep adding to them going forward.


  • Initially the program required dark frames and flats. I created an Issue regarding this limitation, and the authors immediately started working on a fix. Within a week or so the program was adjusted to create dummy dark and flat images (basically arrays of the same size as the image, consisting of 1s and 0s, respectively). The interface could probably use some work and I will be contacting the authors about it, but overall I was very pleased with the quick response. Thanks!
  • The region file must be in the right coordinate system (format: saoimage) or the program dies with an unhelpful error. This was neither obvious nor the DS9 default, so there was some confusion and digging on my part.
  • The exposure time is set in the header as ‘DATE-OBS’, which should (?) be selected automatically. I had to go into Extra Observatory Parameters and select ‘DATE-OBS’ (which was the only option given) for it to work.
  • If Tracking Plots and/or Photometry Plots are selected, the plots update as the run progresses and seem to work well (see pretty pictures).
    Tracking and Photometry plots being generated while running.
  • When trying to do a fit, a bunch of array output is printed and then an error. Digging into it, it is a dimension error: something about the comparison stars when doing an np.dot versus what is passed into leastsq. It seems to be related to the issue seen here, with leastsq not knowing how to properly optimize a multi-dimensional array (see the sketch after this list).
    • Well, after digging around for a while, it turns out that I just needed to use fewer comparison stars. I'm not sure if this is a bug or by design and will contact the authors. The documentation refers to using as few as possible in order to increase speed, but it seems like the error should be handled a little better.


  • After processing, I see a graph displaying my light curve (well, I only have half a curve). The dialog box for the Dark and Flat is still displayed, which is a little confusing; it should close earlier. Then I am taken to a box asking for my result file names, with a button for an MCMC Fit. I'm just accepting the default values for all the parameters, of which there are a lot.
    Half a light curve with fit for TrES-3b.
  • This gives me an error finding analyticalTransitModel.so, which is a linked library, so it might just be a problem with running from the repo directory rather than from an installed copy.




  • Trying again the next day, I used the previously installed copy of OSCAAR (instead of the version from the repo) and had no problems with analyticalTransitModel.so. I was able to download a copy of the exoplanets database (by entering the Planet Name and choosing Update Parameters). This ran for 100 iterations, looking fancy and science-like, but then gave me an AssertionError saying that there was no successful optimization and to try new Beta values. Certainly, the error reporting could be a lot more user-friendly. It suggests updating the Beta parameters, but I really have no idea how to adjust them to get more (or any) appropriate results.

MCMC fit and plot. This is from the middle of the process, which eventually errors out.
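For anyone who hits the same dimension error mentioned in the list above: my read (an assumption on my part, not taken from OSCAAR's code) is that scipy.optimize.leastsq wants the residual function to return a flat 1-D array, so anything built per comparison star needs to be raveled before it is returned. A minimal sketch with made-up numbers:

import numpy as np
from scipy.optimize import leastsq

# Fake photometry: a target star and a few comparison stars (made-up data).
n_times, n_comps = 100, 3
np.random.seed(42)
comp_fluxes = 1.0 + 0.01 * np.random.randn(n_times, n_comps)
target_flux = comp_fluxes.mean(axis=1) + 0.01 * np.random.randn(n_times)

def residuals(weights, target, comps):
    # Model the target as a weighted sum of the comparison-star fluxes
    # (the same kind of np.dot mentioned above).
    model = comps.dot(weights)
    # leastsq expects a 1-D residual array; returning anything with more
    # dimensions produces exactly the kind of shape error described here.
    return np.ravel(target - model)

initial_weights = np.full(n_comps, 1.0 / n_comps)
best_weights, status = leastsq(residuals, initial_weights,
                               args=(target_flux, comp_fluxes))
print(best_weights)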


All in all I am impressed with the work done so far and look forward to working with the authors to help improve OSCAAR. I think there is some general clean-up work that could be done on the GUI (I notice there is a branch for a new GUI), both in day-to-day use and in how it reports errors. I would also like to see a Python 3 version. The website talks about the dependencies, but I think the only thing not directly ported to Python 3 is PyFITS, which has been rolled into astropy and should be working well, so perhaps the port could happen soon (yes, that is sort of me volunteering).
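As a data point on how small that PyFITS change would be, the port is mostly an import swap; a minimal sketch (the file name is just a placeholder):

# Old code would have done: import pyfits; hdulist = pyfits.open(...)
from astropy.io import fits  # astropy.io.fits is the successor to PyFITS

hdulist = fits.open('image.fits')   # placeholder file name
header = hdulist[0].header
print(header.get('DATE-OBS'))       # e.g., the keyword OSCAAR reads
hdulist.close()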

Sunday, October 13, 2013

International Observe the Moon Night

Stand about, look, let your eyes see the moon.
Had a good night out at the VIS checking out the moon. The University Astrophysics Club at UH-Hilo was up doing some LADEE Lunar Impact Monitoring for the Pacific International Space Center for Exploration Systems (PISCES). InOMN encourages everyone to take notice of our nearest celestial neighbor by simply going outside and looking up!

Sunday, July 21, 2013

2013 Lunar Science Virtual Forum Overview

You can't sequester science.

At least that's the idea coming out of the 2013 Lunar Science Forum from the NASA Lunar Science Institute (NLSI). The 6th annual forum saw a number of changes, starting with a (budget-enforced) switch to a completely virtual format, all the way to the announcement that the Lunar Science Institute is now the Solar System Exploration Research Virtual Institute (SSERVI). One can extrapolate from the name change that the focus will no longer be only the Moon (think small bodies like asteroids) and that the forum will be virtual from here on out. My thought is that this is a good thing, as the NLSI really seems to be trying to bridge the work between scientists, universities, and governmental organizations. There were a few questions throughout the forum about co-operation with the private sector, although that discussion was, perhaps surprisingly, limited.

The SSERVI crew posing after a well-run #LSF13.
This being an all-virtual forum, there were naturally some challenges and issues. Overall, the folks at SSERVI (I believe most of them were at Ames) did a really great job with most of the technical issues, while the problems seemed to stem from participants who failed to pay attention to instructions. The agenda includes links to the entire archived conference, which can be played back in real-time, including all the chat features and user blunders. It's kind of cool, actually. Definitely one of the advantages of an all-virtual conference is that things like a complete archive of all the sessions are readily attained. Indeed, you can even check me out in the Lightning Talks here. :)

One can easily garner the gist of current research by looking over the agenda titles, with lunar geology dominating most of the sessions (naturally), although there was a fair sprinkling of volatiles, exospheres, human exploration, heliophysics and other misc. items scattered throughout. The individual speakers ranged from extremely fascinating to hard-to-understand-for-various-reasons to dry-n-pedantic, and the variation was largely independent of the subject matter.

The basic format was to have one or two general speakers give an introduction of sorts to the subject matter for the day. These were largely interesting and well done, giving brief overviews of the relevant issues and subject matter. For instance, the second day opened with some quick introductions to geology and volatiles as they are present on the Moon. Dana Hurley, of the Johns Hopkins Applied Physics Lab, began with an insight about how, when she started in the field 15 years ago, the questions would have been about whether or not water existed on the Moon, a question we have now thoroughly answered. The questions now, she said, are: 1) what are the present-day abundances of volatiles, and 2) what is the actual composition of these volatiles? You can see her concise overview talk here; it seemed to reflect the general consensus that we have learned a whole lot recently and are about to learn a whole lot more.

Throughout the forum (I keep wanting to say conference) there was a lot of talk about LADEE, which is natural as it is about to launch (window opens Sept. 6 - see launch details) and will provide a lot of science to most of the people attending the forum. It's interesting to see all the different components of LADEE, and I can't seem to shake the feeling that everyone is trying to simultaneously pat each other on the back and upsell the LADEE mission in an effort to convince people it is worthwhile. Actually, this is more the feeling I get from everything that comes out of NASA these days, which makes sense as they face large budget cuts and a public that is skeptical of spending money on things it doesn't understand. Including the need for lunar forums.

On a personal note, and as mentioned, I did participate in the Lightning Talks, with Krystal giving a good overview of the hard work she has been doing and that I have been helping with as a volunteer. I also gave a quick summary of some of my plans to create an easy-to-use data-acquisition black box for the lunar impact monitoring (details later). Overall, it was nice to get the (brief) exposure and a good chance to tout both PISCES and UH-Hilo.

SSERVI's concluding note highlights a few areas of interest, including the awards ceremonies, and links to the relevant areas. Below are a few talks that I thought were cool, in no particular order:

Sunday, July 7, 2013

Imiloa Talk - Observing the Center of the Milky Way at 45,000 feet

Having lived in Hilo for the last couple of years, and on the Big Island for over four years now, it is a bit saddening that I have never bothered to become a member of the Imiloa Astronomy Center or to really attend many of the awesome events they regularly put on. Today, however, I had the opportunity to change that by attending an interesting talk by Ryan Lau, a graduate student out of Cornell University and a native Hawaiian resident, discussing his graduate work in infrared astronomy aboard NASA's SOFIA, a modified 747 designed to operate at 45,000 feet.

One of SOFIA's recent flight plans
Lau did a really great job of giving a brief overview of his work and of infrared astronomy in general, opening with an anecdote about his realization that the Sagan Walk, which originates at Cornell University, has its final piece of installation - representing the nearest star system to our sun, Alpha Centauri - right here at the Imiloa Center, a project just completed this last December. Reflecting on it afterward, it is interesting to me that his diagram of the Sagan Walk, with its arrows stretching nearly 8,000 km across the United States, ended up looking similar in spirit to his later slide depicting the flight paths taken by SOFIA as it flies across the United States. The contrast, of course, is that while we have mastered the ability to fly ourselves easily across our own globe, the thought of travelling to our nearest star, let alone something like the galactic center, remains as daunting a concept as the flat world was to early explorers.

Any talk given to the general public about a highly advanced and specialized subject necessarily skips over a lot of the details, but I thought Lau did a really good job in his presenting style, his ability to answer questions, and the overall format of the talk. As he highlighted in the beginning, he wanted listeners to come away from the hour with three key points:
  • Infrared is awesome.
  • Airborne astronomy is awesome.
  • The galactic center is awesome.
A look inside SOFIA. Lau is in bottom left.
With that in mind, Lau guided us through a brief but thorough overview of infrared astronomy, including the current ground- and space-based systems (R.I.P. Herschel) as well as the problems associated with each, with atmospheric conditions mostly affecting the former and prohibitive cost/effectiveness the latter. So onward to the second key point, that of airborne astronomy and SOFIA.

There is literature out there (follow the links above) describing SOFIA so I won't go into too much detail other than to say that it appears to be a pretty cool setup. Flight times are typically on the order of 10 hours, with nearly 5 hours of science accomplished on each flight. A typical session will last about 3 weeks, with 2-3 flights happening each week, for a total of 10-15 flights per session. The latest session, which Lau was on the plane for, had ended the previous week and had included some exciting new images of the galactic center, which is the thrust of Lau's work and his third key point.

The Galactic Center is Awesome
Specifically, Lau has been using the Faint Object Infrared Camera for the SOFIA Telescope (FORCAST - playing fast and loose with acronyms) to look at two main regions of the galactic center: the inner 10 light years and the gaseous torus of dust that surrounds the black hole there, as well as The Sickle, a dense region of recent massive star formation and the site of three identified Luminous Blue Variables - extremely bright stars that can undergo unpredictable and radical shifts in brightness and spectra. In addition to being extremely bright, these stars are also extremely rare, with only 20 of them ever having been confirmed. To have three of them in one area of focus is quite extraordinary, and the latest pictures generated by Lau also showed some pretty exciting properties that have never been observed before.

One of the really fascinating things for me when I see stuff like this is that the science is a work in progress. Lau had literally taken these images last week and revealed some pretty incredible and awesome pictures of the two focus areas - images which aren't released to the public yet - and is still in the active process of interpreting and analyzing the results. While I would love to spill the beans on this by talking about these cool properties, I'll go ahead and wait until Lau has published his work before highlighting anything. Needless to say I thought it was pretty cool.

As mentioned, Lau did a good job of explaining these concepts and some of his results and overall just did a really thorough and concise job of giving a run-down of his recent work. Thanks again to Ryan Lau and the Imiloa Astronomy Center for the opportunity.