As we saw in part 1 and part 2 of this series, the typical measurements of sky brightness in Providence are between about 4.1 and 4.3 nelm (naked eye limiting magnitude) on clear nights. Here is a graph showing a typical hazy summer night. The readings were taken on the night of July 1st into the morning of July 2nd, 2014, and are in the range that we commonly see. The dashed horizontal line is a somewhat arbitrary divider between typical and darker nights. When the reading is darker than about 4.3 the observing is much better.
Looking at a graph of the sky brightness doesn't give an intuitive idea of what the sky actually looked like for observing. We can get a better sense from the wide-angle views of the sky taken by the camera mounted on the roof. Here is a time-lapse movie from the same night as the graph above.
Sunday, August 10, 2014
Saturday, August 9, 2014
Sky Brightness 2
In my previous post I began to analyze the data from the sky brightness meter at Ladd Observatory. Now we'll take a closer look at the broader trends. Here is a scatter plot showing the data from the summer and fall of 2013. The plot is a little busy, but we're really only interested in the "bottom line" where the data points are at the lowest values. All of the nights are superimposed on one another, with the x axis showing hours UTC. This graph summarizes how the sky brightness changes during the course of the night. The many values between 3.7 and 4.3 are due to nights that are more or less hazy. The moisture in the atmosphere scatters light from the city back down to us and causes the overall sky to look brighter.
If we follow the lowest readings there is a definite trend: on the clearest nights the sky starts off at about 4.2 at the end of twilight and slowly, steadily darkens to about 4.45 by 4 hours UTC. There is then a small but rather sudden jump to 4.55, after which the slow darkening continues until we are at about 4.6 in the early morning. I'm not sure what is causing the step at 4 hours, but it may be due to city lights that are on a timer. The takeaway here is that the sky is slightly, but significantly, brighter in the early evening. The best time to observe is after midnight local time through the early morning.
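One way to extract that "bottom line" is to bin the readings by hour and keep the darkest value seen in each bin. Here is a minimal sketch in Python, assuming the data have already been reduced to simple (hour UTC, nelm) pairs; the meter's actual file format will differ:

```python
from collections import defaultdict

def darkest_envelope(readings):
    """Given (hour_utc, nelm) pairs from many nights, return the
    darkest reading seen in each hour bin.  Higher nelm = darker sky."""
    bins = defaultdict(list)
    for hour, nelm in readings:
        bins[int(hour)].append(nelm)
    return {h: max(vals) for h, vals in sorted(bins.items())}

# Hypothetical readings from two nights: one hazy, one clear.
readings = [(2, 4.0), (2, 4.2), (3, 4.1), (3, 4.3), (4, 3.9), (4, 4.5)]
print(darkest_envelope(readings))  # {2: 4.2, 3: 4.3, 4: 4.5}
```

Following the per-hour maxima of this dictionary across a whole season traces the clearest-night trend described above.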
Friday, August 8, 2014
Sky Brightness
"The sky above the port was the color of television, tuned to a dead channel."
- Neuromancer, William Gibson, 1984.
At the Ladd Observatory we operate a weather station and a number of other rooftop instruments to monitor the environment. One of them is a sky brightness meter. On a regular basis we use the live data to judge the quality of the sky for observing. It is also used to document long term changes such as the increase in light pollution.
Sky brightness meter and camera on the roof.
The sensor is too sensitive to take a measurement during the daytime. It starts collecting data shortly after sunset when the sky begins to darken and stops during morning twilight just before sunrise. Last summer I calibrated the meter and we've now collected 300,000 data points in about one year. I thought this would be a good time to analyze what we have so far.
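A quick back-of-the-envelope check of that figure (the nine-hour average night length is my assumption, not a measured value):

```python
points_per_year = 300_000
nights = 365
dark_hours = 9            # rough average night length (assumption)

points_per_night = points_per_year / nights
seconds_between = dark_hours * 3600 / points_per_night
print(round(points_per_night))   # ~822 readings per night
print(round(seconds_between))    # ~39 seconds between readings
```

So the meter is logging roughly one reading every half minute or so while the sky is dark.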
Monday, August 4, 2014
As the Bubbl Bursts
The invention of magnetic bubble memory was once seen as a revolutionary development in computing: the wave of the future. It is now a nearly forgotten technology.
"Many persons expect that the most dramatic changes in digital systems will result from magnetic-bubble chips that could well hold a million or more bits in the not-too-distant future. Along with charge-coupled devices, these memories show promise of replacing magnetic tape and disks for small systems." [emphasis in the original]
- Understanding Digital Electronics, Texas Instruments, 1978
One of the more unusual computer objects that I've collected over the years uses this memory. It is called the QSB-11A Bubbl-Board. I was told by the person who sold it to me that it had been used in a system at Los Alamos National Laboratory. I have no idea how it was used. Given the nuclear research conducted there, I sometimes wonder if I should check to see if it is "hot."
Saturday, August 2, 2014
W9GYR
As a young child I remember my late grandfather operating a ham radio station in Chicago using surplus military equipment that he obtained at the end of World War II. When I received my own amateur radio license about a year ago I began to wonder when he first became involved in the hobby. I suspected that he started before the war, so I dug through old FCC publications which listed newly issued licenses. I couldn't find a single mention of his name or the call sign that he was assigned: W9GYR.
My first clue to narrow down the search was a website called Old QSL Cards which has a large collection of the postcards that amateur operators send each other to confirm that they had made a radio contact. QSL is early radiotelegraph shorthand for "I am acknowledging receipt" of a wireless message. They had a card from my grandfather that was dated 1939.
QSL card from my grandfather from Jan. 26, 1939. Scan courtesy of Old QSL Cards.
Thursday, July 31, 2014
"The Red Skies" of 1883
"It is impossible not to conjecture a connection with the volcanic eruption in the Sunda Straits, by which, on Aug. 26, the island of Krakatoa disappeared wholly from the face of the earth."
"The terrible nature of this outburst can hardly be realized: the sky was darkened for several days, the noise was heard two thousand miles, magnetic disturbances were noted, the tidal wave was distinctly felt at San Francisco, and the atmospheric disturbance was sufficient to cause marked barometric fluctuations, which were noted by the barographs on the continent, in England and America, for several succeeding days."
- W. Upton, "The Red Skies." Science, 11 January 1884
During the fall of 1883 there was a remarkable atmospheric phenomenon which "attracted great attention not only from the general public, but from scientific men, who have endeavored to give a satisfactory explanation of it." At the time that he wrote those words Winslow Upton had just accepted the position of Professor of Astronomy at Brown University. Prior to this he had been Assistant Professor of Meteorology in the U.S. Signal Service, beginning in 1881. The phenomena that he endeavored to explain were the "recent fiery sunsets" seen throughout the world.
The Scream (1893) by Edvard Munch (National Gallery, Oslo, Norway)
The sight of the blood red sky seen at sunset may even have inspired the Norwegian artist Edvard Munch, who "felt a great, unending scream piercing through nature."
Note: I originally published this on the Ladd Observatory Weather Underground blog in 2011.
Tuesday, July 29, 2014
Silicon to Supercomputer
The J90 logic is implemented using application-specific integrated circuit (ASIC) chips fabricated by IBM. There are 10 unique ASICs that are found in the processor and memory modules. A typical J90 system could contain about 230 of these CMOS chips. The photo below shows a processor module with the cover removed. Each module contains 4 scalar/vector processors. The space at the top of the board can be used for optional HIPPI interfaces or Y1 Channels to additional I/O Processors.
A Cray J90 quad processor module.
The ASIC chip types are:
- MBI - DRAM memory interface
- MAD - Memory side of memory crossbar for read data
- MAR - Memory side of memory crossbar for write data
- VA - CPU side of memory crossbar for write data
- VB - CPU side of memory crossbar for read data
- CI - Channel interface (I/O)
- JS - Shared registers for multi-CPU applications
- PC - Scalar processor and processor control
- VU - Vector processor
- MC - Maintenance and clock distribution
There is only one chip (called PC) for each scalar processor and one additional chip (called VU) for each vector processor. So only 8 of the 26 chips on each processor module implement the CPUs; the remaining 18 handle communication between processors or between the processors and the memory banks. This circuitry is the key to a "balanced" system, where the memory bandwidth is great enough to sustain the rate at which the processors operate on the data.
Saturday, July 26, 2014
The summit station at El Misti, Peru (19,200 feet)
"The night is passed at the hut, and the final ascent to the summit made on the second morning. This occupies several hours, as the animal stops to rest every fifteen or twenty feet at this altitude. On two occasions I was obliged to walk a short distance to cross snow which had drifted across the path, and realized the extreme difficulty of breathing during the exertion required."
"The effect of the altitude upon me was chiefly to cause headache, sleeplessness and partial loss of appetite. On one occasion while at the summit I experienced a decided feeling of faintness for a short time."
- Winslow Upton, Physiological Effect of Diminished Air Pressure, Science, 27 December 1901
Misti summit station, Jan. 5, 1894. Shut in by cloud [and] snow looking N. E.
During the academic year of 1896-97 Prof. Winslow Upton took a sabbatical from his work as Director of Brown University's Ladd Observatory. He spent ten months at the new southern station of the Harvard College Observatory (elevation 8,050 feet) in Arequipa, Peru. His primary goal was to measure the geographical position of the station before astronomical observations could commence.
During this time he also made four ascents to the summit of the dormant volcano El Misti, which was the site of recording instruments (pictured above) maintained by Harvard. At the time it was the highest meteorological station in the world at an elevation of 19,200 feet.
Thursday, July 24, 2014
Geared to the Stars
The telescope at Ladd Observatory uses a clock drive to compensate for the Earth's rotation and track the stars. Modern telescopes use electric motors but this one was built in 1891 before electric power distribution was common. The Observatory originally had gas lamps for lighting and the telegraph system was powered by "gravity cell" batteries. The telescope's mechanical clock drive is weight driven with the speed regulated by a centrifugal friction governor.
A closeup of the clock drive showing the governor in motion. An optical sensor and precision timer are used to measure the rotation rate.
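The rotation-rate measurement mentioned in the caption comes down to simple arithmetic: the timer records the instant of each pulse from the optical sensor (one pulse per revolution), and the rate is the reciprocal of the average interval. A hypothetical sketch:

```python
def rotation_rate(pulse_times):
    """Rotation rate in revolutions per second from a list of
    timestamps (seconds), one pulse per revolution."""
    if len(pulse_times) < 2:
        raise ValueError("need at least two pulses")
    intervals = [b - a for a, b in zip(pulse_times, pulse_times[1:])]
    mean_period = sum(intervals) / len(intervals)
    return 1.0 / mean_period

# Hypothetical pulses 0.5 s apart -> 2 revolutions per second.
print(rotation_rate([0.0, 0.5, 1.0, 1.5]))  # 2.0
```

Averaging over many revolutions this way smooths out the small speed fluctuations of the friction governor.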
Monday, July 21, 2014
Power Up
The Cray J916 was featured at our monthly open house this past Saturday. We spent most of the day talking to visitors about the system and showing them the rest of our collection. We did make some progress in the morning before we opened. Dave took some photos while I was working.
Everything seems to be functional with one exception - there is a system clock PWR FAULT light showing on the Central Control Unit. We suspected a loose cable or board connector and traced the signal paths. We couldn't find the cause of the fault and will need to dig deeper another time.
Working on the system clock board on the J90 backplane. The Central Control Unit is in the foreground.
Sunday, July 20, 2014
The Dawn of a New Era
When I was a young child I would watch reruns of the original Star Trek. It wasn't so much the space ships or aliens that impressed me. It was seeing human beings just simply standing on another planet that moved me. It gave me the idea that there were other worlds out there, and that you could travel beyond the Earth to visit them. That sparked my imagination.
My parents would then change the television channel and again I would see people walking on another world. But this time it was on the 6 o'clock news. A grainy video of astronauts in bulky spacesuits standing on a monochrome landscape with the crackling audio of a voice calmly saying "Beautiful, magnificent desolation." It was, arguably, one of the few moments in human history when reality was more amazing than our wildest dreams.
Six months after Apollo 13. Photo credit: Mom, Halloween 1970.
Saturday, July 19, 2014
Prismatic Analysis
One of the projects that I've been working on is the Brashear astronomical spectroscope from 1891. We're trying to recreate the ability to record spectra using the original photographic plate holder. Here is the wooden plate holder mounted in place of the eyepiece assembly that is used for visual spectroscopy.
A focusing screen was created by mounting the ground glass in a foamcore frame that fits the plate holder.
Thursday, July 17, 2014
Jedi vs. the Droids
How does the performance of a 20 year old supercomputer compare to the devices that we use today? Let's compare the Cray J916 to a recent laptop and a smart phone.
According to an archived copy of the Cray J90 Series webpage the vector processors have a theoretical peak performance of 200 mflops each, giving our 8 CPU system 1.6 gflops. But, your mileage may vary depending on the code that is running. One of the standard benchmarks used for supercomputers is LINPACK. Results for a J916 with the same configuration as ours are listed in Performance of Various Computers Using Standard Linear Equations Software by Jack J. Dongarra from June 1995. An 8 CPU system was measured at 1.436 gflops, an efficiency of about 90% of the theoretical peak.
Next we'll run Linpack for Android. My Samsung Galaxy Note II has a 1.6 GHz ARM Cortex-A9 with four cores. Running LINPACK multi-threaded gives about 200 mflops, just a little faster than a single J90 processor. So, yes, that is (nearly) a mid-1990s entry level supercomputer in my pocket. At least on paper. We're really just exercising the ability to do floating point calculations, and this is not necessarily a good measure of system throughput on a real problem.
I estimate that the theoretical peak performance of the ARM is about 3 gflops or so, giving well below 10% efficiency. (I'm ignoring the GPU as I have no way to run LINPACK on it to benchmark it.) I should mention that the Android version of LINPACK is based on this Java Version and the low efficiency is in part due to the Java Virtual Machine.
But, overall, the Cray system with a 100 MHz clock speed has roughly seven times the measured LINPACK performance of an Android phone running at 1.6 GHz.
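The comparison here is just arithmetic on the quoted benchmark figures; a quick sketch (the 3 gflops phone peak is, as noted above, only my rough estimate):

```python
# Peak vs. measured LINPACK numbers quoted in this post.
cray_peak      = 8 * 0.200   # 8 CPUs x 200 mflops = 1.6 gflops
cray_measured  = 1.436       # gflops (Dongarra, June 1995)
phone_measured = 0.200       # gflops, LINPACK for Android
phone_peak     = 3.0         # gflops, rough estimate (assumption)

print(round(cray_measured / cray_peak * 100))    # 90 (% efficiency)
print(round(phone_measured / phone_peak * 100))  # 7  (% efficiency)
print(round(cray_measured / phone_measured, 1))  # 7.2 (Cray/phone ratio)
```

The striking part is the efficiency gap: the vector machine sustains about 90% of its peak on LINPACK, while the phone manages well under 10%.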
"Just right for you" - it certainly is for us... |
Tuesday, July 15, 2014
bootp
We are at the point where we can power up the Cray and begin configuring it. Next we need to work on the J90 System Console (or SWS, the Service WorkStation). This is used to net boot the I/O Processors in the I/O Subsystem, which in turn load UNICOS into the J90 main memory. The SWS is a Sun SPARCstation 5 running Solaris. We didn't get the original SS5 that came with the Cray, so we had to build one.
We had a couple of these lying around, but they weren't in very good shape. We cobbled together a system from the parts which is better configured than what would have been used when the J90 was installed in 1996. This system has the maximum of 256 MB of RAM and a 24 bit S24 TCX graphics card.
There are two Ethernet interfaces. The one on the SBus card is 10BASE2, more informally known as ThinWire. This is only used to connect the SWS to the two I/O Processors in the I/O Subsystem. This allows the IOPs to net boot from the SWS and is also used to configure and manage the system.
The SWS with ThinWire Ethernet, FDDI, and graphics.
Monday, July 14, 2014
This is Arecibo Calling...
"Ironically, the globular cluster at which the signal was aimed won't be there when the message arrives. It will have moved well out of the way in the normal rotation of the galaxy." - It's the 25th anniversary of Earth's first attempt to phone E.T. Cornell Chronicle, Nov. 12, 1999
My estimate of the Arecibo transmission from a couple of years ago. It looks like we may have missed...
In 1974 a ceremony was held to dedicate a major upgrade to the radio telescope at Arecibo Observatory. As part of the festivities the telescope was used to transmit a message towards Messier 13, the Great Globular Cluster of stars in the constellation Hercules.
"Scientists Hope to Reach Hypothetical Civilization in a Cluster of Stars" - New York Times, Nov. 20, 1974
Sunday, July 13, 2014
System Ready
In a previous post I described the power requirements of the Cray J916 and the importance of the Central Control Unit (CCU) monitoring for faults. This is rather critical as the machine won't function properly unless these hardware status signals check out.
With the system re-cabled we can power it up and begin testing the hardware. We noticed that the CCU contains rechargeable D size batteries. Yes, we received a donated Cray and batteries were included! They were, of course, very dead. The system had been unplugged for a long time.
The back of the I/O Subsystem in the Peripheral Cabinet. This is the cleanest cable management system I've ever seen, but tracing cable paths while re-wiring the system was time consuming.
Saturday, July 12, 2014
Half a Million Years of U.S. History
"Gutzon Borglum's design intentionally left three extra inches of granite on the surface of the sculpture so that nature, in the form of wind and water erosion, would finish carving Mount Rushmore for him over the next 20,000 years." - Matthew Buckingham in Cabinet Magazine
Workmen on face of Geo. Washington, Mt. Rushmore. Source: Library of Congress
I've stumbled upon claims similar to this a number of times. The amount of stone and the number of years varies. But, I've never seen a footnote or reference to a source that confirms that the meme is true. I found the above quote about 5 months ago in a post titled "Half a Million Years of U.S. History" on the Long Now Foundation Blog. My curiosity compelled me to spend some time digging for an answer. I wrote a reply in the comments section of the blog which was never published. Perhaps I should have waited a bit longer, given the nature of the Long Now project. I'm impatient... So, I'm going to publish what I found here:
Friday, July 11, 2014
Be a Computer Crusader!
The Retro-Computing Society of RI will be featuring the Cray Jedi at the next open house on Saturday July 19th at our facility in Providence. This fall is our 20th anniversary. (If you are looking for gift ideas we prefer the modern platinum to the more traditional china.)
A short time after our group formed we made arrangements for a private tour of The Computer Museum in Boston. Our member Carl has a detailed write-up of the visit and the collection that we saw. Not long after our 1996 visit TCM closed and the collection was moved to a new home on the west coast which became the Computer History Museum.
Another of our members created The Retro-Computing Lab Millspace Tour which gives a nice snapshot of what we were doing between late 1996, when we moved into the Eagle Street facility, and mid-1998, when the page was last updated.
A Data General poster depicting the Nova and Eclipse minicomputers as superheroes. From the collection of the Computer History Museum.
Warning: This page may take some time to load if you are viewing graphics.
Thursday, July 10, 2014
Under the Covers
What's inside a Cray J90? I'm going to take a pause in the restoration to "pop the hood" and describe what's under the covers. Here we have the two cabinets bolted together with the doors and most of the panels removed. On the right is the Processing Cabinet and on the left is the Peripheral Cabinet.
The Processing Cabinet has a Central Control Unit at the top which has LEDs that display status and fault conditions. Below that is a plenum space for the cooling blower intake and dust filters. Next is the backplane (the Processor and Memory Modules are inserted from the back.) Under that is the blower which exhausts out the back. At the bottom are the 48 volt DC power supplies.
The Peripheral Cabinet is moderately configured with plenty of space for expansion. In the center is the I/O Subsystem VME Chassis which contains two VME backplanes. Each contains a SPARC single-board computer which functions as an I/O Processor (IOP.) The remaining VME slots contain the system interfaces including Ethernet, disk controllers, and the I/O Buffer Boards (IOBB) that transfer data from peripherals to the J90 processors and memory. At the bottom are two disk arrays.
The Cray J916 with the front doors and many of the filler panels removed.
Tuesday, July 8, 2014
Galaxy Collision
Two galaxies colliding and merging.
Generated using the simulation code GADGET-2 running on a small number of nodes on the HPC cluster at the Center for Computation & Visualization and rendered using IFrIT. The T= shows simulation time in billions of years.
The source code computes the gravitational forces between the ordinary matter within the galaxies and the dark matter halos surrounding them. The dark matter is not rendered in this visualization. The code was created in 2000 and last updated in 2005. It is optimized for massively parallel computers with distributed memory.
A somewhat better quality version can be viewed here.
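GADGET-2 itself uses a sophisticated tree algorithm on distributed memory, but the force it approximates is the plain softened Newtonian pairwise sum. A toy direct-summation version in Python (purely illustrative, not the actual GADGET-2 code; the softening length here is arbitrary):

```python
def gravity(positions, masses, G=1.0, eps=0.05):
    """Direct-sum softened gravitational acceleration on each particle.
    positions: list of (x, y, z); masses: list of floats."""
    n = len(positions)
    acc = [[0.0, 0.0, 0.0] for _ in range(n)]
    for i in range(n):
        xi, yi, zi = positions[i]
        for j in range(n):
            if i == j:
                continue
            dx = positions[j][0] - xi
            dy = positions[j][1] - yi
            dz = positions[j][2] - zi
            r2 = dx*dx + dy*dy + dz*dz + eps*eps   # Plummer softening
            inv_r3 = r2 ** -1.5
            acc[i][0] += G * masses[j] * dx * inv_r3
            acc[i][1] += G * masses[j] * dy * inv_r3
            acc[i][2] += G * masses[j] * dz * inv_r3
    return acc

# Two equal masses attract each other with equal and opposite force.
a = gravity([(0, 0, 0), (1, 0, 0)], [1.0, 1.0])
print(a[0][0] > 0 and abs(a[0][0] + a[1][0]) < 1e-12)  # True
```

This direct sum is O(N²), which is why production codes like GADGET-2 replace it with tree and particle-mesh approximations for millions of particles.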
Kilowatts, Control Cables, and Cooling
The Retro-Computing Society of RI is located in the Atlantic Mills in the Olneyville neighborhood of Providence. We have a 1,500 sq. ft. facility in a mixed-use complex that was originally a worsted mill during the industrial revolution. It suits our needs quite well with amenities like a loading dock and a freight elevator. As I described in the previous installment, the system weighs more than a half ton. It was surprisingly easy to move as it has well designed casters that allow it to be smoothly rolled around. Now that it has arrived we need a plan to plug it in. Our electrical panel was a bit under-powered for running the Cray.
The system requires three-wire, single-phase power at 240 volts. A fully configured Processing Cabinet could draw a maximum of 4,200 watts, while a full Peripheral Cabinet could draw up to 3,600 watts. Our J916 is a moderately configured system which is about one half full. There is plenty of space for expansion to add additional drive bays, for example. There is a helpful Electrical Requirements Worksheet in the Preparing for a System Installation manual. For this configuration we calculate that both cabinets draw a total of about 2,000 watts or so. The property manager scheduled an electrician to upgrade our service and install the receptacles. We also made other changes to make it easier to power up some of our other machines that we haven't been able to run recently. The work was completed on June 30, 2014.
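The worksheet amounts to summing per-component figures and checking the result against the circuit rating. A sketch with made-up wattages (the real numbers come from the Cray manual, not from me):

```python
# Hypothetical per-component draw in watts -- illustrative only;
# the real figures come from the Electrical Requirements Worksheet.
components = {
    "processor/memory modules": 1200,
    "I/O subsystem":             500,
    "disk arrays":               300,
}
total_watts = sum(components.values())
amps_at_240v = total_watts / 240
print(total_watts)             # 2000
print(round(amps_at_240v, 1))  # 8.3
```

At 240 volts a roughly 2,000 watt load is under 10 amps, which is why a half-populated J916 is manageable on ordinary commercial service.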
The back of the J916 with the two cabinets bolted together. Each cabinet has an AC power entry box which we had removed to prevent damage to the power cord during the move.
Monday, July 7, 2014
Customer Name (if not confidential)
It was about four years ago when I first learned that there was still a Cray in the Department of Physics at Brown University. The machine was in the same building as my office; for years I had no idea it was sitting idle just a few floors above me. My colleague Prof. Ian Dell'Antonio facilitated the donation to the Retro-Computing Society of RI. He uses the high energy theory cluster for research on observational cosmology and gravitational lensing.
In my previous installment I described the Theory Computing Cluster machine room where the Cray J916 had been used for high energy theoretical physics research at Brown University. It was installed in 1996. I'm not sure exactly when they stopped using it. I was told that it was difficult to program and had been unused for some time. Supercomputers tend to have a rather short shelf life.
We first moved the Cray to Brown's Science Center where it was on display for a few years. Before it could be safely moved I had to "split" the cabinets. There is one Peripheral Cabinet housing the I/O Subsystem and disk arrays. The second cabinet is the mainframe, or Processing Cabinet, which contains the CPUs and memory. The Cray documentation sometimes uses the archaic definition of mainframe to refer to the primary frame (or cabinet) that contains the central processor. The Oxford English Dictionary cites a usage from Honeywell in 1964 and gives this definition:
My other computer is a Cray...
mainframe, n.
2. Computing. Originally: the central processing unit and primary memory of a computer. Now usually: any large or general-purpose computer, esp. one supporting numerous peripherals or subordinate computers.
Sunday, July 6, 2014
The Theory Cluster
The use of high performance computers has had a tremendous impact on the progress of science. These machines have enabled us to advance our understanding of everything from elementary particles to the large scale structure of the universe. The fastest systems of any era are referred to as supercomputers. For many years supercomputing was synonymous with the machines designed by Seymour Cray at Control Data Corporation and later at Cray Research.
Supercomputers have always been very large and expensive. They require a large amount of electrical power and exotic cooling systems. They are typically a shared resource used only at large government research laboratories and academic institutions. By the late 1980s a new class of minisupercomputer was introduced. With a price starting at less than one million dollars, these smaller air-cooled systems could be dedicated to a single research group or academic department.
In the mid 1990s the Brown University Department of Physics was the first physics department in the U.S. to acquire a Cray system. In August 1995 a Cray EL98 was installed. This was followed in late 1996 with the installation of a Cray J916. They were used for high-energy and condensed matter theoretical physics. Details of the research are at the Computational High Energy Physics group page.
The Theory Cluster Machines webpage of the High Energy Physics Group at Brown University. The page was created in late 1995 and includes a publicity photo of the Cray EL98 that had just been installed. The snapshot was captured using NCSA X Mosaic on a SPARCstation 5 running Solaris.
Tuesday, July 1, 2014
Shatter Cone
Shatter Cone from the Sudbury Basin.
In 1992 I attended a conference on Large Meteorite Impacts and Planetary Evolution in Sudbury, Ontario. The highlight of the visit was the field trip that we took to view the geological evidence of a massive meteorite impact nearly two billion years ago. The impacting object was likely in the range of 10-15 km (about 6 to 9 miles) in diameter. It left an oval basin of 60 by 30 km (about 40 by 20 miles) containing breccia, a rock formed from the broken fragments of pre-impact bedrock. There are also rocks that have been shattered by the shock wave from the impact. These are only found in impact craters and are called shatter cones.
Monday, March 10, 2014
The Transits of Venus
The 2012 transit of Venus by Michael Umbricht
The telescope is a refractor with a 12" aperture and 15' focal length. A solar pre-filter was used on the objective of the telescope, and a broadband hydrogen-alpha nebular filter on the camera further reduced the brightness to a level suited to the camera's sensitivity. The weather was mostly cloudy during the day and this image was taken during a brief thinning of the clouds, which gives the Sun's surface a mottled appearance.
The photo at left shows Michael Umbricht preparing the telescope and camera before the transit.
Sunday, March 9, 2014
The Chemical Furnace
Celestial globe, late 19th century.