Programmer demonstrates his "perfect" Minesweeper AI

Code Bullet claims in this demo video, "I was able to create what I believe to be a perfect minesweeper player." Read the rest

iPad review video made on an iPad

Serenity Caldwell made a video about the iPad using her 2018 iPad and an Apple Pencil. Now I feel guilty for using my iPad mainly as a Netflix streamer.

[via Doobybrain] Read the rest

Review: Zotac's Zbox EN1070K is the tiny game PC that could

Regular readers will know I'm fond of tiny computers. During my search for one powerful enough to play games on, I found several beautiful and well-made options. But none were so wee as the Zotac Zbox EN1070K [Amazon], which is roughly the size of a Sega Dreamcast. I've had it for six months, now, and can report that it's great: easily the most enjoyable, compact, no-nonsense game-ready PC I've ever owned.

Miniaturization is accomplished by using the MXM video card form factor originally devised for laptops. In the past, this would have resulted in a severe performance compromise. But current Nvidia models hit close to the numbers posted by full-size counterparts. Even with Zotac slightly underclocking the GTX 1070 (presumably for heat reasons), it benchmarks close enough to the full-size model that I doubt I could tell the difference side-by-side.

There's even a model with the GTX 1080 [Amazon] in it, but it's twice the size of this one, and I wanted small. It turns out the 1070 is more than enough for every game I've tried, outpacing the GTX 970-equipped PC I upgraded from. The latest games at the highest settings on a 4K monitor would push it, I'm sure, but if you need that, maybe a PC the size of a hardback novel isn't in your future.

There are compromises to bear in mind. Upgrading the i5 Kaby Lake CPU is possible, but I won't be chancing it for a long time -- it voids the warranty and requires almost complete disassembly. Read the rest

iMac Pro reviewed

TechCrunch's Veanne Cao reviews Apple's iMac Pro. It's a beautiful, powerful machine, Veanne writes, but when it comes to high-end video work the price premium over a similarly-specced Windows box makes it a hard sell.

There’s a period of zen we reach as editors when we’re plowing through an edit, when we’re so consumed by whatever project we’re working on that hours will pass before we realize we’ve forgotten to eat, sleep, pee. ... With the iMac Pro, I’m reminded of how enjoyable video editing can be.

But...

I definitely can’t justify its price tag to my corporate overlords. My two friends who run production companies with teams of 14 and 28 echoed the same sentiment: “It doesn’t make sense, business-wise, with that many employees.” And of my freelance colleagues, even the ones consistently landing high-paying gigs, all but one said it wasn’t worth the price: “I’d rather spend the extra few thou on lenses or a new body.”

I would still buy it if I were doing lots of high-end pro work. Why? Because Windows is hinky.

It's not a platform for taking pleasure in one's work, unless you're lucky enough to be working in a field that requires only one particular well-made app to get it done. Windows is a platform for disinterested drudgery and games. Just last week, Microsoft pushed out a "Windows Ink" update that broke my Wacom gear, with no obvious or easy workaround until Wacom published a hacky command-line fix. Mac OS is far from perfect, but at least it doesn't force on me Microsoft's drivers for its own comically low-end tablet PCs. Read the rest

Confession: You can't trust a junkie with a new laptop

There's still plenty of life left in my 2015 MacBook Pro. But sooner or later, I'll ditch my computer in favor of something new.

The nerd in me is wicked excited with the notion of using an ultra light laptop with an external graphics processor, for several reasons. I've always wanted to own a gaming laptop, but I could never justify the price, or the weight of one in my bag. Going with a computer that can connect to an external GPU means that I could invest in the laptop first, and then the GPU when I could afford it. And since the GPU for the rig is external, I wouldn't be forced to carry around a heavy bastard of a computer with me every time I needed to take off on assignment. That said, I was hesitant to buy one without seeing how it'd perform, first and foremost, as a work machine. I really like the look of the Razer Blade Stealth: the laptop's industrial design is what Apple might have come up with if their design department had a shred of edge or attitude. So, relying on the privilege of my position as a tech journalist, I asked Razer if I could borrow one.

They said yes.

I spent the past month working on Razer's insanely well-built ultrabook. It was pimped out with 16GB of dual-channel RAM and a 2.7GHz Intel Core i7 processor. It's zippy! But then, that's in comparison to my daily driver: a three-year-old Core i5 with 8GB of RAM. Read the rest

Watch this bizarre Komputer Tutor supercut of the phrase "floppy diskette"

An absurd and wonderful example of semantic satiation, starring the "Komputer Tutor" Kim Komando, best known for her bestselling 1990s instructional videos sold via infomercial. And in case you were wondering, Kim Komando is still at it!

Read the rest

NASA's got a computer model for predicting landslides

Landslides are bad news. In parts of the world where heavy, sustained rains can rapidly give way to flash flooding, they're responsible for tragic losses of life, property and transportation infrastructure. That the latter can wind up under hundreds of tons of mud and debris makes it far more difficult for first responders to do anything about the former--if you can't get to people, you can't save them. Since we can't change the weather, we can't stop landslides. But NASA's churned out new tech that could make the difference between an evacuation and a recovery effort.

According to Space.com, NASA's got a hot new computer model designed to identify landslide hazards around the world, every 30 minutes:

Heavy, sustained rainfall is a key trigger of landslides around the globe. So Kirschbaum and co-author Thomas Stanley, a landslide expert with the Universities Space Research Association at NASA Goddard, built the new model using rainfall data gathered by the Global Precipitation Measurement satellite mission, which is run jointly by NASA and the Japan Aerospace Exploration Agency.

The model also employs a "susceptibility map" to determine if areas getting hammered by rain are particularly landslide-prone — for example, if they lie on or near steep slopes and/or tectonic-plate boundaries, or have been subject to significant deforestation.

High-risk areas are identified in "nowcasts," which the new open-source model produces every 30 minutes.
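The basic recipe described above (overlay live rainfall on a static susceptibility map, and flag cells that trip both) can be sketched in a few lines. The grid, thresholds and values below are hypothetical illustrations, not NASA's actual model parameters:

```python
# Sketch of a landslide "nowcast": flag grid cells where recent rainfall
# exceeds a trigger threshold AND the static susceptibility map marks the
# terrain as landslide-prone. All thresholds and data here are hypothetical,
# not NASA's actual model parameters.

def nowcast(rainfall_mm, susceptibility, rain_threshold=50, susc_threshold=0.7):
    """Return a grid of booleans: True = high landslide risk right now."""
    return [
        [rain >= rain_threshold and susc >= susc_threshold
         for rain, susc in zip(rain_row, susc_row)]
        for rain_row, susc_row in zip(rainfall_mm, susceptibility)
    ]

# Example: a 2x3 grid. Susceptibility is high near steep or deforested slopes.
rainfall = [[60, 10, 80],
            [55,  5, 40]]
susceptibility = [[0.9, 0.9, 0.2],
                  [0.8, 0.1, 0.95]]

risk = nowcast(rainfall, susceptibility)
# Only cells that are both rain-soaked and susceptible get flagged.
print(risk)  # [[True, False, False], [True, False, False]]
```

The real model does this on a global grid every 30 minutes, with satellite rainfall in place of the toy numbers here.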

Given the number of lives per year that this computer model's predictions could save, to call this news huge would be an understatement. Read the rest

Meet vintage videogaming's archivist extraordinaire

Back in 2012, we published a feature about Frank Cifaldi, one of the world's leading collectors of rare vintage videogames and related ephemera. Since then, Cifaldi founded the Video Game History Foundation, dedicated to preserving this vibrant art form's history and culture for the ages.

(Vice) Read the rest

Beautiful photos of beautiful vintage computers

I think all modern computers should look like these vintage ones photographed by James Ball a couple of years ago. [via Wired] Read the rest

Enhance your ZX Spectrum with this glorious backlit keyboard

ZNRenew enhances your old Sinclair personal computer with beautiful colored cases and, soon, a striking backlit version of its infamous rubber chiclet keyboard.

Read the rest

Samia Halaby is an 81-year-old Commodore Amiga artist

"Use a material for what it's capable of doing," Samia Halaby says. "You don't make something out of wood that should be made out of Iron."

She's not dinking around in Deluxe Paint either, like that hamfisted hack from Pittsburgh did back in the '80s. Halaby is coding generative, animated art in AmigaBasic!

The Guru Meditation: "Samia Halaby is a world renowned painter who purchased a Commodore Amiga 1000 in 1985 at the tender age of 50 years old. She taught herself the BASIC and C programming languages to create "kinetic paintings" with the Amiga and has been using the Amiga ever since. Samia has exhibited in prestigious venues such as The Guggenheim Museum, The British Museum, Lincoln Center, The Chicago Institute of Art, Arab World Institute, Mathaf: Arab Museum of Modern Art, Sakakini Art Center, and Ayyam Gallery just to name a few."

Read the rest

Cool Linux handheld computers coming in 2018

Giant Pockets rounds up the options for owning an ultraportable, ultra-light computer that runs an easily-accessible Linux distribution: The World of Linux Handhelds in 2018. Thanks to Android, this is more niche than it ever was, but there's a surprisingly large number of options either already out or coming soon. Freedom is fun! So are games...
I have somewhat mixed feelings about where these devices go. First, I am really glad to see a lot more activity and official support of Linux on more and more hardware, because more choice means more competition and likely better options altogether for several types of users. However, I am quite concerned with the overall trend to make such devices more and more premium, price-wise.
The Gemini is very much my cup of tea, and should ship imminently. Read the rest

Noodle Pi is a powerful 3D-printed pocket computer

Noodle Pi is described as "the smallest, lightest, most open handheld / wearable computer," a Raspberry Pi tightly packaged with a high-resolution multi-touch screen, battery and camera in a compact 3D-printed all-weather case. About the size of a large smartphone but much thicker and more versatile, the Noodle Pi has a bunch of accessories to go with it including a keyboard/touchpad dock and a "Noodlendo" clip to attach it to a classic game controller.

Noodle Pi uses the recently released Pimoroni HyperPixel 3.5" display. This is a high speed, high resolution (800x480 pixels at ~270 PPI) touchscreen display with 18-bit color (262,144 colors) and a 60 FPS frame rate.

Noodle Pi also integrates the Raspberry Pi Camera Module v2, for up to 8MP still photos, and 1080p30 / 720p60 / VGA90 video.

Noodle Pi is powered by an internal 500mAh battery. It can be charged via a regular micro-USB charging socket, and there's a red LED to provide a low battery warning.
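For the curious, the quoted "~270 PPI" figure for the HyperPixel display checks out: an 800x480 panel with a 3.5-inch diagonal works out to roughly 267 pixels per inch.

```python
import math

# Sanity-check the quoted pixel density: diagonal resolution in pixels
# divided by the diagonal size in inches.
width_px, height_px = 800, 480
diagonal_in = 3.5

diagonal_px = math.hypot(width_px, height_px)   # ~933 px along the diagonal
ppi = diagonal_px / diagonal_in                 # ~266.6 pixels per inch
print(round(ppi))  # 267, consistent with the quoted "~270 PPI"
```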

You can buy it for $200, or as a bag of parts (without electrics) for $50. Read the rest

Make your own ALTAIR computer based on an Arduino

Designed by Ed Roberts and released in 1974, the MITS ALTAIR sparked the personal computer revolution and was the basis for Microsoft's first product, the Altair BASIC interpreter. It cost $439. While ALTAIR replica kits and online emulators have been available for years, there's now a $149 kit that substitutes the ALTAIR guts with an Arduino Due while duplicating its iconic LED-laden case. From Stephen Cass's review of the Altairduino at IEEE Spectrum:

The Altairduino is derived from David Hansel’s work on cloning the Altair with the Arduino Due and Arduino Mega 2560. If you want to build one of Hansel’s designs from scratch, you can do so by following his free instructions on hackster.io. The advantage of Davis’s kit is that it provides all the components, including a nice bamboo case and plastic front panel, along with a custom printed circuit board (PCB) that greatly simplifies construction.

The Altairduino improves on the original Altair in two important respects. First, it offers modern interface options. You can connect an old-school terminal using an optional DB-9 connector (which I will stipulate should properly be called a DE-9 connector, so no need to send me letters this time!), but you can also use a soft terminal running on a computer via a USB connection, or even Bluetooth....

The second big improvement is that the Altairduino comes loaded with a lot of software. You can call up some programs purely by flipping various front panel switches, such as Kill the Bit, a game that hacked the Altair’s memory-address indicator lights to act as a 1-dimensional display.
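Kill the Bit's mechanic is simple enough to sketch: a single lit bit rotates through an 8-bit word shown on the panel lights, the player's front-panel switches are XORed into that word each tick, and flipping the right switch at the right moment zeroes the bit, while a mistimed flip lights an extra one. Here's a rough Python simulation of that rule; it's an illustration of the idea, not the original 8080 program:

```python
# Rough simulation of Kill the Bit's core rule (not the original 8080 code):
# a single lit bit rotates through an 8-bit display word; each tick, the
# player's switch register is XORed in. Hit the same bit to kill it;
# miss and you turn on an extra bit.

def step(display, switches):
    """One game tick: apply the player's switches, then rotate left by one."""
    display ^= switches                                  # XOR: right bit kills, wrong bit adds
    display = ((display << 1) | (display >> 7)) & 0xFF   # 8-bit rotate left
    return display

d = 0b00000001
d = step(d, 0b00000001)   # timed right: the bit is killed, display goes dark
print(f"{d:08b}")  # 00000000

d = 0b00000001
d = step(d, 0b00000010)   # mistimed: now two bits are lit
print(f"{d:08b}")  # 00000110
```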

Read the rest

ParadiseOS: far from it

ParadiseOS depicts an alternative computing world from the turn of the millennium: a desktop obscenely slathered in compulsory and broken services, ads and applications, an experience designed by dotcom-era advertising boyars but hopelessly unrealistic before the wide availability of broadband internet and hardware video decoding. It's part Black Mirror, part vaporwave, part ironically brilliant web development by Stephen Kistner.

Paradise OS imagines an alternate version of 1999 where the personal computer is a gateway to a commercialized global network. Palm Industries, a former mall developer turned technology giant, effectively controls all online experiences.

Acting as a time capsule, the desktop captures the moments of December 30, 1999 — just days before a catastrophic Y2K event leads to the computer emerging in our dimension. Participants explore this frozen moment in time, using the content to discover more about the world from which it came.

The project references the visual vernacular of the 20th century American shopping mall. It establishes a connection between the mall and the Internet as escapist experiences and hubs of social activity.

The desktop's content deals with Internet phenomena including fake news, instant gratification and information overload. By engaging with contemporary topics from the perspective of an alternate reality, the project encourages participants to think more critically about the state of our own digital spaces.

Read the rest

Voxel computer art

Ciara Burkett is making wonderful voxel computers and such; bookmark their Ello page for more. [via Jay Allen]

Above, an Apple ///. Here's a Compaq portable: Read the rest

Men have been pushing women out of tech since the beginning

Programming was women's work: the six who ran ENIAC, America's first digital computer, were women. But not for long.

They were systematically pushed out of the field, says technology historian Marie Hicks, assistant professor at the University of Wisconsin-Madison, who wrote about it in her recent book, “Programmed Inequality” (Amazon).

Sexism was so extreme in the UK that it played a significant part in the collapse of its first domestic computer industry in the 1960s, writes the WSJ's Christopher Mims:

Not only were the male recruits often less qualified, they frequently left the field because they viewed it as an unmanly profession. A shortage of programmers forced the U.K. government to consolidate its computers in a handful of centers with the remaining coders. It also meant the government demanded gigantic mainframes and ignored more distributed systems of midsize and mini computers, which had become more common by the 1960s.

In 1984, 37% of computer science degrees were awarded to women, but that share has been in decline ever since. Women are leaving the industry in increasing numbers, "despite" its "diversity and inclusion efforts."

If a firm has hired its first 10 employees and they are all the same gender or ethnicity, an eleventh who doesn’t look like the rest can face challenges.

The First Women in Tech Didn’t Leave—Men Pushed Them Out [WSJ] Read the rest

More posts