Charlie ("Charles N") Brown was the force behind Locus Magazine (previously) until his death in 2009; he hired me to be a columnist for the magazine in 2006 and I've been writing for them ever since. Read the rest
The Arch Mission Foundation is nano-etching 30,000,000 pages' worth of "archives of human history and civilization, covering all subjects, cultures, nations, languages, genres, and time periods" onto 25 DVD-sized, 40-micron-thick discs that will be deposited on the surface of the moon in 2019 by the Beresheet lander. Read the rest
University of North Carolina at Chapel Hill is using a grant to create kits for novice archivists to use in underserved communities. Dubbed Archivist in a Backpack, the kits actually range in size and scope, from backpacks loaded with recording equipment and guides to rolling suitcases with flatbed scanners. Read the rest
RESPONSIBILITIES: Actively work in the care, cataloging, storage, and preservation of all artifacts and archival materials, and the care, cleaning, and monitoring of all exhibits. Maintain and update the archival database system. Monitor the movement of archive inventory. Assist the appropriate staff in accessing the archives collection as required. Travel and act as a courier of artifacts to locations where they are to be displayed, including setting up and taking down exhibits at these locations. Execute, maintain, and provide accurate condition reports for all items being moved from storage for exhibition. Ensure that the collections manual, preservation plans, and archives emergency plan are observed. Locate, retrieve, and prepare artifacts for display and loans. Ensure the integrity of the collection is maintained at all times. Oversee all cleaning of exhibit spaces. Work with outside vendors to schedule monthly, quarterly, and annual cleanings. Assist with the Archives' long-term planning, conservation goals, and preservation needs. Photograph and/or scan artifacts when needed. Assist with exhibition installations. Maintain displayed artifacts in a proper environment and eliminate risk to artifacts. Assist the Director of Archives with coordinating activities involving maintenance, preservation, and mansion upkeep. Ensure the integrity of the exhibitions is maintained at all times. Read the rest
At Vintage Cassettes, "you will find the beautiful pictures of sealed compact cassettes."
Cassettes from 1970-1990 are covered the most. Collecting vintage cassettes is a great hobby and brings back good memories. Cassettes are organized by brand and then by the year they were produced. We concentrate on the most important brands. The site tries to cover three markets: the US, Europe, and Japan. We have also added a History of Compact Cassettes, located to the right.
When I was a kid I wondered if METAL meant that it was specially made for taping, like, Megadeth.
The companion site (with better images) would be The Tape Deck, which posts pictures of the cassettes themselves.
It's not for the public ("accessible in the Yale library"), but researchers are working on a "universal translator" for old computer files that might otherwise be lost to obsolescence. Jessica Leigh Hester, at Atlas Obscura:
When one CCA visitor wanted to take a look at a CD-ROM-based “multimedia website” produced in conjunction with a 1996 exhibition of work by the architect Benjamin Nicholson, Walsh needed to wind back the clock. He tracked down an old license for Windows NT and installed Netscape Navigator and an old version of Adobe Reader. This all enabled decades-old functionality on a two-year-old HP tower.
This strategy works, but it has drawbacks. “These environments are time-intensive to create, will only run on a local computer, and they typically require a lot of technical know-how to set up and use,” Walsh says. Ad hoc emulation is not for the novice or the busy.
Researchers at Yale are working to solve this problem by creating a kind of digital Rosetta Stone, a universal translator, through an emulation infrastructure that will live online. “A few clicks in your web browser will allow users to open files containing data that would otherwise be lost or corrupted,” said Cochrane, who is now the library’s digital preservation manager. “You’re removing the physical element of it,” says Seth Anderson, the library’s software preservation manager. “It’s a virtual computer running on a server, so it’s not tethered to a desktop.”
In the early days of TV, it was routine to tape over the recording medium after the initial air-date, which means that no video record exists of many of the pioneering moments in television. Read the rest
Science Friday's beautiful "File Not Found" series looks at the thorny questions of digital preservation: finding surviving copies of data, preserving the media it is recorded upon, finding working equipment to read that media, finding working software to decode the information once it's read, clearing the rights to archive it, and maintaining safe, long term archives -- all while being mindful of privacy and other equities. Read the rest
Compuserve's sprawling, paleolithic forums were acquired along with Compuserve itself by AOL in 1998, and their fossil remains were augmented, year after year, decade after decade, by die-hard users who continued to participate there. Read the rest
Robots (or spiders, or crawlers) are little computer programs that search engines use to scan and index websites. Robots.txt is a little file placed on webservers to tell search engines what they should and shouldn't index. The Internet Archive isn't a search engine, but has historically obeyed exclusion requests from robots.txt files. But it's changing its mind, because robots.txt is almost always crafted with search engines in mind and rarely reflects the intentions of domain owners when it comes to archiving.
Over time we have observed that the robots.txt files that are geared toward search engine crawlers do not necessarily serve our archival purposes. Internet Archive’s goal is to create complete “snapshots” of web pages, including the duplicate content and the large versions of files. We have also seen an upsurge of the use of robots.txt files to remove entire domains from search engines when they transition from a live web site into a parked domain, which has historically also removed the entire domain from view in the Wayback Machine. In other words, a site goes out of business and then the parked domain is “blocked” from search engines and no one can look at the history of that site in the Wayback Machine anymore. We receive inquiries and complaints on these “disappeared” sites almost daily.
A few months ago we stopped referring to robots.txt files on U.S. government and military web sites for both crawling and displaying web pages (though we respond to removal requests sent to email@example.com). As we have moved towards broader access it has not caused problems, which we take as a good sign.
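The robots.txt convention described above is easy to see in action. The sketch below, using Python's standard-library `urllib.robotparser`, shows how a rule-following crawler interprets the kind of blanket-exclusion file a parked domain typically serves; the crawler name and URL are hypothetical.

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt blocking every path for all crawlers,
# as a parked domain might serve it after a site goes out of business.
ROBOTS_TXT = """\
User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Any crawler that honors the file must skip the entire site -- which is
# why an archive that obeyed it would also stop displaying the site's past.
print(parser.can_fetch("ExampleBot", "https://example.com/about.html"))  # prints False
```

A search engine skipping the live parked page is the intended effect; the Wayback Machine hiding years of the site's history is the unintended one the Internet Archive describes above.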
Brewster Kahle, who invented the first two search engines and went on to found and run the Internet Archive, has published an open letter describing how the W3C's move to standardize DRM for the web, without protecting otherwise legal acts like archiving, will hurt the open web. Read the rest
A ruling about a DC university held that posting course videos to the open web without subtitling them violated the Americans With Disabilities Act (while keeping them private to students did not) (I know: weird), and this prompted UC Berkeley to announce the impending removal of 20,000 open courseware videos from Youtube. Read the rest