Keywords:

Summary

This is a demonstration of viewing an exo-atmospheric object from a ground-based sensor. In this case the object looks like the Moon, but is much smaller (only 16.8 km across) and is in a geo-synchronous orbit.

Details

This "moon satellite" is modeled as a built-in sphere (via the ODB file) that uses the sphere's native UV mapping to wrap a moon texture around it. The scene also contains a built-in sphere for the Earth. The Earth object is not directly imaged, but it casts a shadow on the moon satellite (an "eclipse" of sorts around midnight), and the moon satellite sees the Earth object as a background source (what we might call "earthshine" onto the object).

Important Files

Geometry and Materials

The "moon" object is defined in the geometry/moon.odb file using the built-in SPHERE object geometry. The texture map for the moon can be found in maps/moon-surface1k.pgm.

The Earth model (behind the sensor) is defined in the geometry/earth.odb file using the SPHERE object geometry.
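The object ODB files themselves are compact. The following is a minimal sketch of what geometry/moon.odb might contain; the SPHERE block field names shown here are assumptions based on the general DIRSIG ODB format, so consult the actual file in the demo for the exact contents:

DIRSIG_ODB = 1.0

# A unit sphere at the local origin; the instance in scene.odb
# supplies the final position and scale.
OBJECT {
    SPHERE {
        CENTER = 0.0, 0.0, 0.0
        RADIUS = 1.0
        MATERIAL_IDS = 1
    }
}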

These two objects are then combined into a scene in the geometry/scene.odb file:

DIRSIG_ODB = 1.0

# Position the earth below observers feet (offset by 4 meters down to
# ensure it doesn't interfere with the camera).  The scale is to squash
# the round sphere into an oblate spheroid.
OBJECT {
    ODB_FILENAME = earth.odb
    UNITS = METERS
    INSTANCES {
        INFO = 0, 0, -6378140.0, 1, 0.996647189, 1, 0, 0, 0
    }
}

# Position a lunar texture on a sphere at 35,000km altitude (geo orbit)
# the angular extent of this object is about 5.6e-4 radians.
# The extra rotations rotate the lunar texture to look like it does
# from the USA
OBJECT {
    ODB_FILENAME = moon.odb
    UNITS = METERS
    INSTANCES {
        INFO = 0, 5988075.458, 34483952.098, 70, 70, 70, 90, 0, 270
    }
}

Platform and Tasking

This simulation uses a platform with a simple 320 x 240 (QVGA) camera that has a read-out rate of 0.00055555556 Hz (a period of 1800 seconds, or 30 minutes). The demo.tasks file defines an instantaneous capture and the video.tasks file defines a 24 hour (86400 seconds) collection window, which produces 36 capture frames.
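The relationship between the read-out rate and the frame interval is just a reciprocal; a two-line sketch using the rate quoted above:

```python
# Read-out rate from the platform description (Hz) and the implied
# frame period in seconds.
rate = 0.00055555556
period = 1.0 / rate

print(f"period = {period:.1f} s")  # ~1800 s, i.e. a 30 minute frame interval
```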

Setup

Single-Frame (Still) Simulation

To run the single-frame simulation, perform the following steps:

  1. Run the DIRSIG demo.sim file

  2. Load the resulting demo-t0000-c0000.img file in the image viewer.

Multi-Frame (Video) Simulation

To run the multi-frame simulation, perform the following steps:

  1. Run the DIRSIG video.sim file

  2. Load the resulting frame files (demo-t0000-c0000.img, demo-t0000-c0001.img, etc.) in the image viewer.

This simulated collection lasts a day, with a platform that captures an image every half hour.

Results

Single-Frame (Still) Simulation

The single-frame simulation produces a single image frame.

demo
Figure 1. Output of the single-frame simulation.

Multi-Frame (Video) Simulation

The imaging instrument is set up to use the "file per capture" output schedule. As a result, the simulation produces 36 separate image files for the 36 captures. The animation below was created from these 36 frames.

video
Figure 2. The output of the multi-frame simulation.