Tag Archive: “geography”
I’ve been playing around with ideas for a new satellite tracking web site. The main idea is to supplement a conventional 2D ground track map (like what I made for Where’s That Sat, or as often seen on the big screen at mission control) with a 3D view that displays the same orbits as paths swept out around the globe. I believe that seeing the actual shape of an orbit in space helps you to interpret the corresponding flat representation. So, there is even an eccentric pedagogic purpose to this project: it is an instructional aid for developing orbital map-reading ability.
Here’s an example:
The code to generate this image is crude, but the results are already encouraging.
Posted on Thursday, July 11th, 2013.
Arcencode is my Tcl implementation of the compression algorithm used by both MapQuest and the Google Maps API to encode a list of coordinates as a printable string. It can be used, for example, to obtain a relatively compact representation of a route that can be passed as a single parameter value to a web map.
Here is a GPX file exported from a map created with MapMyRun. It contains a track representing a route through a nearby park, pictured above. The track consists of 790 points, listed here in plain text form. If the contents of that plain text coordinate list are assigned to a variable named coordinates, the encoded form can be obtained with:
set encoded [arcencode $coordinates]
With line breaks added, the contents of encoded are:
q}g`GrlfnMPDND\DJ?L?T?P?N?N?N?P?VAPANANAPANAPANATGNELGXENANANCNANA^BNDPDNBNDNDND
NDPDNDNBNDNDPDNDNDNDVHNFPFNFNFNFPFNFNFNFPFNFNFNFPFNDNFNFPFNFNFNFVFNBPBNBNBNBVFND
PDNDPDNFPDNDPDNDPDNDPDNDPDNDPDNFPDNDPDNFPFNDPFNFPFNDPFNFPFNDZLHFJDNFPHNFNHPFNH\F
LAJ?V?F?PFNFNDNFNHZNLHNHLHLHLHNJLHLHT?N?PAN?PAPAVDPFNDNDPDNFNDPDNDNFPDNDPDNFNDPD
ND\JLDNBNDLDNDNBTFNDLBLDVDN@PBN@PBN@PBNBP@XHNHLFNHNFNFLHNFNFPVBNPVJFHHRLHFPANCPC
PAPAR?L@R@P@P?N@P@P@P@P@VFLDLFRHNFNFPFNFNFNFNFNFNFTLHFLPLNJLRTJHPJJDPBTAJCPGPGNE
PGNGPGNGNEPGNGPGNEPGNGNGPGNEPGZIJAJAVJLDLDRAPENCPCNCPENCPCNEPCXGNCLCNCNCPDPHTTJJ
HLHJLNJLLLRLLBPFRNJLJJJLJLTHL@T@L?N@THLHLHLHNLNJLLLJNJ\LL@N@N@L?N@ZALENCLCNENCRI
NGNELGZKNGNGNGNGNGNGNENGNGNGNGNGNGNGTUFQHOHOFQN[HOFMPYHKHINQJKLLHRHPHPHRJPHPHRHP
J^BPDRDPDVJVHNHPPNNJLHLHLJRFPFNDNDPFNDNDPFXHLFLDLDRFNFNDNDRBNBN@RENCPEPCNEXCH@P@
VJJHHHFBEWCSEUGSIUO[KOIMIMUWKMMKMMKMMMMKKMMMMMSMOGMIOIOGOIYQMIMIMGMIKIUMMKOIOKMI
OIOKMIOIOKOISKOEMGOEUCOCQAOAQCQAYCMAMCSEOEOEQCOEWCM@WBMBOBODOBUJMFMHMHMLOJMLWPMF
MFMFWFMBOBODYBO@O?O@O@O@SCOAOCQEOEQEQEOGQEOGWGMAU?M@O@QDODQBWFQBOBO@QBOBOBQBOBSB
O@QB[DOBOBODOBOBOBOBUDQDOBOBQDOBOBW?KEQGMGQ@QBQFOD_@NODOFODOFOFODOFODOFODOFOFODO
FODYJK@Q?SGKCOMKKQQKMKOKMOKUMOGOGOGOGOGOGOGQGOGWKIEOGWEQ?OAQAOAQ?OAQAOAQ?W?SGIIK
GKOKQMOOMMISCUHGFQGOGOIOGOIOGOIQIOGUGQAOCQAOCQAOCQAOC[GOEOEQEOCOEOEOEOEOEOESEOGQ
EOEQGOEQEOEQGUGOEQGOEOEOEQGOEU?Q?O@Q?O@O@YIOIMKMIOIMIOIMIMKQGQGOGOGOGYEM?M?Q@[GM
GOGMGMGMGUIMGOGWMKEMCQIQEOGOGQEOGOGOGQE[KQEOEOEQEOEOEQEOEOEQEOEOEQEOEOEQEOGOEQEO
EWEOCQCOEOCOC_@KOGOGOGOEOGOGQGOGOGOGOGOGOGOEOGOGOGOGOGOGOGWGOEQEOEQEOEQEOEQEOEQE
OEQEOEQEOE[AK@M@Q@Q@OBQ@QFOFOFWBO@Q@O@Q@O@Q@Q@U?O?Q?Q?O?Q?Q?O?]CKEMCME
Succinct, considering it encodes detailed geometry. A coordinate list of the 790 corresponding points can be recovered from this block of text with the complementary command:
set decoded [arcdecode $encoded]
The decoded coordinate list can be examined here.
An important proviso: this encoding scheme is lossy. Specifically, coordinate values are rounded to five decimal places. Compare the input coordinates to the decoded output for an example. A different precision can optionally be specified, but greater precision reduces the amount of compression. (Note that the Google Maps API is only compatible with five-digit precision.)
How does this algorithm work? Google offers a technical step-by-step explanation. Essentially, compression is achieved by storing only the difference between each coordinate value and the previous value in the sequence; these deltas require fewer digits than storing each value in full, especially since precision is limited to a fixed number of digits. The values are packed into a printable Base64-style representation in a way that eliminates the need for delimiters between coordinate values: each value is bit-shifted left before output (the vacated low bit records its sign), and a flag bit in each output character marks whether the next character continues the current value or begins a new one.
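To make the scheme concrete, here is a sketch of the same published algorithm in Python rather than Tcl (the function names encode_polyline and decode_polyline are my own for illustration; they are not part of arcencode):

```python
def encode_polyline(coords, precision=5):
    """Encode (lat, lon) pairs with the Google/MapQuest polyline algorithm."""
    factor = 10 ** precision
    out = []
    prev_lat = prev_lon = 0
    for lat, lon in coords:
        ilat, ilon = round(lat * factor), round(lon * factor)
        # Store only the delta from the previous point.
        for delta in (ilat - prev_lat, ilon - prev_lon):
            # Left-shift one bit; invert if negative, so the low bit
            # carries the sign and small magnitudes stay compact.
            v = delta << 1
            if delta < 0:
                v = ~v
            # Emit 5-bit chunks, low bits first; the 0x20 flag on a
            # character means "another chunk of this value follows".
            while v >= 0x20:
                out.append(chr((0x20 | (v & 0x1F)) + 63))
                v >>= 5
            out.append(chr(v + 63))
        prev_lat, prev_lon = ilat, ilon
    return "".join(out)

def decode_polyline(s, precision=5):
    """Recover (lat, lon) pairs from an encoded polyline string."""
    factor = 10 ** precision
    coords = []
    idx = lat = lon = 0
    while idx < len(s):
        deltas = []
        for _ in (0, 1):  # one latitude delta, then one longitude delta
            result = shift = 0
            while True:
                b = ord(s[idx]) - 63
                idx += 1
                result |= (b & 0x1F) << shift
                shift += 5
                if b < 0x20:  # no continuation flag: value is complete
                    break
            deltas.append(~(result >> 1) if result & 1 else result >> 1)
        lat += deltas[0]
        lon += deltas[1]
        coords.append((lat / factor, lon / factor))
    return coords
```

Encoding the three test points from Google's own documentation, (38.5, -120.2), (40.7, -120.95), (43.252, -126.453), yields the string "_p~iF~ps|U_ulLnnqC_mqNvxq`@", matching the worked example there.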
Posted on Sunday, March 3rd, 2013.
GIS Stack Exchange is a question and answer site dedicated to helping people figure out GIS (geographic information systems) problems. It’s part of a larger Stack Exchange network of Q&A sites about various topics.
I received some helpful replies in response to a query I posted earlier this year, and have found useful information there as a result of other searches, too. So, in an effort to “learn by teaching”, I’ve decided to try contributing answers on a regular basis (at least weekly). I’m not qualified to address most of the topics that come up, but there is a large backlog of unresolved questions to peruse, and I’ve already found a few of interest. Even if I can’t provide authoritative answers, I hope that providing pointers to relevant references might help people figure things out.
Here are the comments I’ve offered so far:
- Cartograms: making base maps and facilitating size comparison (previously, here)
- Shapefiles: clarifying whether polygon features may intersect themselves (no)
- Projections: finding an implementation of space oblique mercator (PROJ.4 supports a Landsat-specific subset; sparked some more ideas for gtg)
One of my answers even got an upvote! So, yay. My geography degree was good for something.
Posted on Tuesday, December 11th, 2012.
My Twitter bot @WheresThatSat is up and running. More information about what it does is available at WheresThatSat.com. In short, it replies to comments about satellites with maps and information about the satellite’s recent course.
Posted on Wednesday, April 25th, 2012.
As I’ve mentioned a few times, I’m making a bot called WheresThatSat which is basically a Twitter interface to Ground Track Generator, my satellite-path-mapping program. The bot responds to queries about satellites (it knows of many – you might even say it has detailed files) by reporting their location at the time they were mentioned.
This week I’ve been making a complementary web site that displays more information (altitude, speed, heading, etc.) along with a Google Map rendition of the satellite’s recent path. The bot will include a map link with each response. The site isn’t finished yet (some icons and styles are still placeholders), but here’s a sneak peek:
My goal is to get things working smoothly enough to let WheresThatSat resume running later this week, at least on a trial basis. Although the bot could search for and reply to any mention of the many satellites it knows about, I’ve decided it will only post unsolicited responses to a sample of tweets about one or two “in the news” satellites (queries explicitly addressed to @WheresThatSat will, of course, have access to a full catalog of satellites). This is partly a matter of manners and partly a matter of avoiding excessive API calls (Twitter imposes rate limits on how frequently programs can interact with it).
Posted on Sunday, April 22nd, 2012.
A ground track comparison of Sun-synchronous satellites in low Earth orbit and geosynchronous satellites in high Earth orbit.
Posted on Tuesday, April 3rd, 2012.
I wrote a little program to make shapefiles (GIS map layers) of satellite ground tracks. Here’s the story of its development, recounted from my comments on Twitter (the internet’s water cooler).
Posted on Saturday, March 31st, 2012.
Here is a presentation I created a few years ago to accompany a little talk I gave to the lab group I was working with at the time. Members of the group had acquired some ad hoc GIS experience, but I felt they would benefit from a higher-level overview of common “geographic information systems” concepts and operations. The presentation touches briefly on a number of topics and includes a variety of example images (mostly uncredited, unfortunately; intended for educational purposes only). I have omitted the final slide, which was a segue into a discussion of specific projects within the group. I hope you will find the rest of the slideshow presented here useful.
Posted on Thursday, February 23rd, 2012.
Processing is a system that makes it as straightforward as possible to do some pretty sophisticated graphics programming. Based on Java, it abstracts enough technical details to let you focus, more or less, on the basic logic of the idea you want to animate. From the web site:
It is used by students, artists, designers, researchers, and hobbyists for learning, prototyping, and production. It is created to teach fundamentals of computer programming within a visual context and to serve as a software sketchbook and professional production tool.
Check out the Exhibition for some examples of what’s possible and the Tutorials to see how easy it is to get started. There is a great collection of examples for specific topics, too, most of which include illustrative applets embedded in the page. The ability to export Processing programs (or “sketches”) as applets is particularly appealing, although my understanding is that some features, such as file I/O, are available only in application or development mode. It works cross-platform.
I know I have encountered Processing before, but my current interest began as I read Andy Lynch’s description of a simple LDraw renderer he implemented as a Processing sketch. That lit a fire under some related ideas of my own that have been simmering for want of an optimal outlet.
But there’s more to my interest than digital bricks: if there isn’t already a decent library (which would be surprising, as many useful libraries seem to be available), I might be tempted to write a shapefile loader, if for no other reason than to complement the shapefile parser I once wrote for Chipmunk Basic. I think it could be fun to experiment with some raster GIS and remote sensing ideas in Processing, too. (Just get the spectral signatures – click, click, click – and you do it. That’s all what it is!) Last but not least, per its original intent, I can envision using Processing as a superior tool to visualize certain data.
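For what it’s worth, the first step of such a loader is simple enough to sketch. The following fragment, in Python rather than Processing for brevity (the function name read_shp_header is hypothetical, not any existing library’s API), parses the fixed 100-byte main header that the shapefile specification defines:

```python
import struct

# Shape type codes from the Esri shapefile specification (a subset).
SHAPE_TYPES = {0: "Null", 1: "Point", 3: "PolyLine", 5: "Polygon", 8: "MultiPoint"}

def read_shp_header(data):
    """Parse the 100-byte main header of a .shp file.

    Per the spec: bytes 0-3 hold a big-endian magic number (9994),
    bytes 24-27 the big-endian file length in 16-bit words, bytes
    28-35 the little-endian version (1000) and shape type, and bytes
    36-67 the little-endian bounding box (Xmin, Ymin, Xmax, Ymax).
    """
    if len(data) < 100:
        raise ValueError("shapefile header is 100 bytes")
    code, length = struct.unpack(">i20xi", data[:28])
    if code != 9994:
        raise ValueError("not a shapefile (bad magic number)")
    version, shape_type = struct.unpack("<ii", data[28:36])
    xmin, ymin, xmax, ymax = struct.unpack("<4d", data[36:68])
    return {
        "file_length_bytes": length * 2,
        "version": version,
        "shape_type": SHAPE_TYPES.get(shape_type, shape_type),
        "bbox": (xmin, ymin, xmax, ymax),
    }
```

From there, reading the individual records (each a small header plus shape-type-specific geometry) would be a matter of walking the rest of the file with the same struct-unpacking approach.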
What sort of Process will you invent?
Posted on Monday, January 11th, 2010.
The big outcome of our initial presentation of BNP data was the development and administration of a survey to assess general social attitudes and specific attitudes towards certain city initiatives among residents of a certain area in the city. The survey was intended to inform groups involved in those initiatives, but it also complements existing data and the lead author’s research interests.
Of particular interest to me was the opportunity to tag some map-based questions onto the survey. We provided a street map (courtesy of the Goog’s cartography gnomes), and we asked respondents to highlight the streets that comprise their neighborhood. Here’s what it looks like:
In survey parlance, this is a “pilot instrument” in at least two regards. Firstly, we were uncertain how readily respondents would understand what they were being asked to do. Secondly, we were uncertain what the best way to analyze the responses would be.
The first concern has proven to be a non-issue. Drawing on a map is fun; I think people found it the most engaging part of the survey. Furthermore, they tend to reason aloud about what makes up their neighborhood as they mark their stomping grounds. These narratives are interesting.
(Yes, I participated in the door-to-door administration of the survey. It was a fascinating experience which deserves further elaboration.)
The second concern, processing the responses, has yet to be fully addressed. The initial plan was to scan and digitize each map for exploratory GIS analysis. What has actually happened to date is that we’ve numbered each major street segment in the neighborhood, and coded each hand-drawn map according to which street segments it includes. This data will inform a statistical social network analysis of the neighborhood.
Anyway, the survey — of the neighborhood surrounding Mary Street and the straw bale house that will be built there — has generated quite a bit of interest. So far we’ve presented the results three times (to the Binghamton Housing Authority’s South Side Alive committee, to the Mayor’s office, and just today to the South Side West neighborhood assembly); a quantitative approach to neighborhood planning and assessment seems surprisingly novel and interesting to residents and officials alike.
Posted on Tuesday, September 1st, 2009.