Edenwaith Blog
8th January 2019 | Programming
Over the past several years, virtual reality (VR) has finally been making some solid inroads towards becoming a viable commercial technology. PlayStation VR seems to be one of the strongest contenders, primarily due to the ubiquity of the PlayStation 4 and its lower system requirements compared to the high-powered PCs needed to run other VR platforms (e.g. Oculus Rift, HTC Vive). The PC market (Windows, Linux, Mac) has the Oculus Rift and HTC Vive as its primary VR products. Xbox does not officially support any VR solutions, even though there are a number of headsets for that platform with varying levels of support. On Android, Google Cardboard, Google Daydream, and Samsung VR (only supported on Samsung's flagship phones) are the big names. But what about iOS?
VR on iOS
Apple took a strong step into the augmented reality (AR) field in 2017 with the introduction of the ARKit framework, which made it much simpler to add AR features to iOS apps. Unfortunately, there is no native Apple-branded VR framework (e.g. "VRKit") at this time. Without strong support from Apple to help define the VR landscape and requirements for its ecosystem, the field is left to a bunch of mostly unknown bit players introducing half-baked products in an effort to enter an emerging market.
Fortunately, the most prominent player for mobile VR is undoubtedly Google with their offerings of Google Cardboard and Google Daydream. Daydream is only available for Android at this time, but Google Cardboard (and the many Cardboard-compatible viewers by other manufacturers) work with both iOS and Android. In addition to the specifications for the construction of the headset, Google also provides a VR SDK for Android and iOS.
I experimented with the Cardboard-compatible Utopia 360° Immersive Bundle, which included Bluetooth headphones and a controller in addition to the headset. The headset by itself is useful for 360° panoramas and immersive videos. I tried a rollercoaster VR app which was interesting to watch, but it gave me motion sickness after just a few minutes. The included instructions warn the user to take a 10-15 minute break for every half hour of use to prevent the adverse effects of VR, such as eye fatigue and motion sickness.
When paired with a controller, VR can provide a new way to reimagine older products, such as the VR adaptation of the classic arcade game XEVIOUS. However, requiring additional accessories to properly interact with VR limits which apps can be used. The Cardboard specifications provide for a single button on the headset, which allows for only very limited physical engagement with the phone. Some apps need to be set up manually on the phone first, and only then can the phone be placed into the headset to begin the VR experience, which makes interacting with the device awkward. Since a dedicated controller is not guaranteed with all VR kits, this can limit the usefulness and functionality of the current batch of apps. Even the Utopia 360° line of VR products is not consistent, since some kits only provide the headset while others may provide additional accessories such as the controller or earbuds.
Without a more "official" solution (such as the PlayStation VR), the experience, especially with controls, is limited and inconsistent. This does not establish a good set of guidelines for what should constitute a good VR experience.
Google kicked things off several years ago with Cardboard, but there has been little progress since then, and Apple has been noticeably absent from the VR scene so far. Mobile VR at this time is more of a fun curiosity, lacking the dedicated, full-time support from the first parties needed to make it a viable reality.
5th January 2019 | Edenwaith
2018 was a continuation of wrapping up the big projects which were being worked on in 2017. As a result, several major projects were completed in the first half of 2018, including a complete rewrite of the Edenwaith website which fully adopted HTML5, CSS3, and responsive web design.
- Edenwaith Website 4.0
- Edenwaith Blog 5.0
- EdenList for iOS 2.0
- Permanent Eraser 2.8
With the major projects being wrapped up early in the year, this left a lot of open time to pursue a number of new projects, which resulted in twenty blog posts being written over the year. This far eclipsed the number of posts I have written in previous years, which could often be counted on a single hand.
Notable blog posts:
The latter half of 2018 was heavily spent investigating how Sierra's AGI game engine works. I will continue these explorations, and I am currently in the process of porting QT AGI Studio from Linux to the Mac. Existing projects such as EdenList and Permanent Eraser will also continue to be developed. As Apple has warned for well over a year, they will be dropping support for 32-bit apps and frameworks with macOS 10.15, so the current version of 33 RPM will likely not work on future versions of macOS.
- EdenList 2.0.1+
- Continuing work on Permanent Eraser 3.0
- QT AGI Studio for Mac
- Potential new project(s)
29th November 2018 | Programming
On a yearly basis, Apple allows developers to reset their list of provisioned test devices so they can delete, rename, or add devices. There are also times when Apple requires the developer to reset this list. Fortunately, resetting the list lets them quickly keep or delete any of the existing devices. If the developer wants to rename a device, the existing device will need to be deleted and then re-added with the new name.
However, let's say you want a complete list of all of your devices and their associated Unique Device Identifiers (UDIDs). By using a bit of JavaScript, you can generate such a list, which can then be saved to a text file or used to easily upload multiple devices to the Apple Developer Portal.
Log in to your Apple developer account and then go to the Certificates, Identifiers & Profiles section. Click on the All link in the Devices section which will list all of your provisioned test devices. You will then run the following code snippet in the JavaScript console of your web browser.
5 December 2019 Update: Apple's website has changed again, so here is the updated code needed to create a list of Device IDs and Names.
var data = document.querySelectorAll(".infinite-scroll-component .row");
var deviceListString = "Device ID\tDevice Name\tDevice Platform\n"
for (var i = 1; i < data.length; i++) {
deviceListString += (data[i].childNodes[1].innerText + "\t" + data[i].childNodes[0].innerText + "\t" + (data[i].childNodes[2].innerText == "iPhone" || data[i].childNodes[2].innerText == "iPad" ? "ios" : "mac") + "\n");
}
console.log(deviceListString);
In Safari, open up the JavaScript Console from the Develop > Show JavaScript Console menu. If the Develop menu is not available, go to the Advanced tab in Preferences and select the Show Develop menu in menu bar checkbox.
In Google Chrome, open up the JavaScript Console from the View > Developer > JavaScript Console menu.
The script will then produce a formatted list of your device IDs and device names. This script works as of this writing in November 2018, but it could easily break if Apple alters the web page.
If you need to upload a bunch of devices, you can take the list generated by the script and save it to a file in the following format:
Device ID Device Name
A123456789012345678901234567890123456789 NAME1
B123456789012345678901234567890123456789 NAME2
If you want to rename, remove or add a device, this is the place to do so. Once complete, this file can then be imported when adding a new device on the Apple Developer Portal.
25th November 2018 | Permanent Eraser
With each new iteration of macOS, Apple continues to strengthen its security measures. While this is essentially a good thing, the additional restrictions can result in some frustration and confusion for users who just want to continue working like normal.
One of the newest restrictions introduced in macOS Mojave now requires explicit permission from the user to use third party Automator actions, which is what Permanent Eraser 2 uses for its Erase service. Upon using the Erase service from a contextual menu in Mojave's Finder, the user will see the following prompt:
Click on the Open Automator button, which will launch that application. Next, select the Automator > Third Party Automator Actions... menu.
In the Third Party Automator Actions window which appears, select the Enable Automator actions from third parties checkbox and click the OK button.
You are now set up to use third party Automator actions again.
20th October 2018 | Programming
This is the third part of a continuing series of articles as I explore the inner workings of Sierra's Adventure Game Interpreter (AGI). This article will focus on the use of colors in these 80s games.
When I was working on my example to export the views from an AGI game, I noticed some inconsistencies in the color palette values I had originally borrowed from another code example. That palette appeared to vary slightly from the standard CGA palette, but after more carefully inspecting the available colors in several AGI games, I found that AGI was indeed using the CGA colors, and not some variant palette.
The following table lists the CGA color palette, which consists of four greyscale highlights (black, grey, light grey, white) and twelve colors formed from a mixture of red, green, and blue components. Note how each of the RGB elements increments by a perfect third. In the 8-Bit Guy's video Modding a consumer TV to use RGB input, he explains how a digital RGB signal only allows for eight colors using three bits (2^3 = 8), but an intensity signal effectively doubles the number of colors. This is what we see with the CGA color palette, where most of the colors have a light and dark variant, the exception being the brown color.
Full CGA 16-Color Palette

 0  Black        (0, 0, 0)        #000000    |    8  Grey           (85, 85, 85)     #555555
 1  Blue         (0, 0, 170)      #0000AA    |    9  Light Blue     (85, 85, 255)    #5555FF
 2  Green        (0, 170, 0)      #00AA00    |   10  Light Green    (85, 255, 85)    #55FF55
 3  Cyan         (0, 170, 170)    #00AAAA    |   11  Light Cyan     (85, 255, 255)   #55FFFF
 4  Red          (170, 0, 0)      #AA0000    |   12  Light Red      (255, 85, 85)    #FF5555
 5  Magenta      (170, 0, 170)    #AA00AA    |   13  Light Magenta  (255, 85, 255)   #FF55FF
 6  Brown        (170, 85, 0)     #AA5500    |   14  Yellow         (255, 255, 85)   #FFFF55
 7  Light Grey   (170, 170, 170)  #AAAAAA    |   15  White          (255, 255, 255)  #FFFFFF
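To make the relationship between the three color bits and the intensity bit concrete, here is a small C sketch (my own illustration, not code taken from AGI or from the video above) that derives the sixteen CGA colors from their 4-bit IRGB index:

#include <stdio.h>

// Derive the 16 CGA colors from a 4-bit IRGB index (illustration only).
// Bit 3 = intensity, bit 2 = red, bit 1 = green, bit 0 = blue. Each set
// color bit contributes 0xAA (170) to its channel, and the intensity bit
// adds 0x55 (85) to every channel.
int main(void)
{
    for (int index = 0; index < 16; index++) {
        int intensity = (index & 0x8) ? 0x55 : 0x00;
        int red   = ((index & 0x4) ? 0xAA : 0x00) + intensity;
        int green = ((index & 0x2) ? 0xAA : 0x00) + intensity;
        int blue  = ((index & 0x1) ? 0xAA : 0x00) + intensity;

        if (index == 6) {
            green = 0x55;   // the lone exception: dark yellow is remapped to brown
        }

        printf("%2d  #%02X%02X%02X\n", index, red, green, blue);
    }
    return 0;
}

Running this prints the same sixteen hex values as the table above, with index 6 (brown) being the only color that does not follow the pattern.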
Since the CRT monitors of the era were based on RGB, rather than the traditional primary colors of red, blue, and yellow, the available colors are more of a mix of red, green, blue, cyan, magenta, and yellow, along with several levels of highlights ranging from black to white.
Most of the standard colors are represented in this palette, with the notable exceptions of orange and purple, two of the three secondary colors on a standard RYB color wheel. These two colors are simulated with the light red and magenta colors.
In the AGI version of Mixed-Up Mother Goose, the giant pumpkin (which houses Peter Peter Pumpkin Eater and his estranged wife) was more of a salmon color, since a true orange was not available. Interestingly, in the SCI remake of Mixed-Up Mother Goose, the pumpkin looks much closer to orange, but upon further inspection one will find that it still uses a similar salmon color; the higher screen resolution and dithering trick the eyes into perceiving something closer to orange. It's interesting to see how doubling the screen resolution, yet keeping the same color palette, gives the illusion of more colors.
Also of interest is how the colors might even vary slightly from one system to another. The screenshot at the beginning of this post is from King's Quest on an Apple ][, where the colors are slightly different from a PC, plus there are some artifacts around the edges of objects where the color bleeds. However, if the screen uses a mono tint, the images and text are a little sharper.
When the Mac version of King's Quest 2 is run through ScummVM, the green is a little more fluorescent in appearance when compared to the DOS version of the game. Whereas the PC uses the traditional #55FF55 for the light green, the Mac uses a more vibrant #00FF00, which is in line with many of the Mac colors, which tend to be a little brighter.
Apple Macintosh Default 16-Color Palette

 0  White    (255, 255, 255)  #FFFFFF    |    8  Green        (31, 183, 20)    #1FB714
 1  Yellow   (251, 243, 5)    #FBF305    |    9  Dark Green   (0, 100, 18)     #006412
 2  Orange   (255, 100, 3)    #FF6403    |   10  Brown        (86, 44, 5)      #562C05
 3  Red      (221, 9, 7)      #DD0907    |   11  Tan          (144, 113, 58)   #90713A
 4  Magenta  (242, 8, 132)    #F20884    |   12  Light Grey   (192, 192, 192)  #C0C0C0
 5  Purple   (71, 0, 165)     #4700A5    |   13  Medium Grey  (128, 128, 128)  #808080
 6  Blue     (0, 0, 211)      #0000D3    |   14  Dark Grey    (64, 64, 64)     #404040
 7  Cyan     (2, 171, 234)    #02ABEA    |   15  Black        (0, 0, 0)        #000000
As can be seen by comparing the Macintosh and CGA color palettes, they hold many similarities, but the Macintosh palette uses better representations of orange and purple.
This article was originally intended as an addendum to a previous post, but it became far more involved as I further explored how AGI and various computing platforms presented color.
AGI Color Palette
[5 April 2019 Update] I have been experimenting with a fair bit of AGI-style art lately. It was becoming annoying having to continually set the colors I wanted, so I created this custom AGI color palette for macOS. Download the file, unzip it, place the AGI Color Palette.clr file into your ~/Library/Colors folder, and the AGI Color Palette will then be available in the standard color picker under the Color Palettes tab.
7th October 2018 | Programming
With the advent of CD-ROM technology, the size of computer games in the 1990s was no longer constrained to a couple of megabytes delivered on a handful of floppy disks. Now, with several hundred megabytes available, improved graphics and audio were possible, including full voice acting.
However, the computers of the early half of the 90s still had relatively small hard drives, often smaller than what a CD contained. This led to game installations where the primary files were installed onto the computer's hard drive, but the bulk of the audio remained on the CD.
Moore's Law continued unabated for a number of years, and the features and capabilities of PCs increased dramatically. Compared to the rest of a modern system, the optical drive is the bottleneck, slowing games down while the CD spins up to play an audio track. Even more telling of technology's progress, optical drives are a rarity these days. On my main computer (an iMac) I have an external optical drive so I can still install games I purchased back in the 90s, but I'd prefer not to have to connect the drive and insert a CD every time I want to play a particular game. This post details how to install the Sierra On-Line game The Dagger of Amon Ra and configure it in DOSBox so the CD is not required when playing the game.
My copy of The Dagger of Amon Ra comes from the King's Quest Collection Series which contained the first seven King's Quest games, several early Sierra Apple ][ games, plus the two Laura Bow games — Colonel's Bequest and The Dagger of Amon Ra. If you have this game on an original game CD or are using a non-Mac system, adjust the instructions as necessary.
In DOSBox, mount the appropriate CD and install the game using the following commands:
mount d /Volumes/KQ_CD3 -t cdrom -usecd 0
D:
cd LB2
install
Next, copy the file RESOURCE.AUD from the CD into the LB2 folder on your computer. This should be a 355MB file. Then, a couple of edits need to be made to the RESOURCE.CFG file so the game will search for the extra audio files on the computer and not on the CD. For Amon Ra, the value for audioSize needs to be set to 32k, otherwise the speech will produce odd beeps and scratching noises, or cause the game to freeze up. Remove the audio key-value pair and replace it with resAUD and resSFX and their associated values. The following is how I configured my RESOURCE.CFG for The Dagger of Amon Ra.
videoDrv = VGA320.DRV
soundDrv = ADL.DRV
audioDrv = AUDBLAST.DRV
joyDrv = NO
kbdDrv = IBMKBD.DRV
mouseDrv = STDMOUSE.DRV
memoryDrv = ARM.DRV
directory = \SIERRA\LB2
audioSize = 32k
minHunk = 206K
cd = no
resAUD=.\
resSFX=.\
patchDir=.\;audiosfx\.
Setting Up Other Games
There are other mid-90s Sierra games (e.g. Quest For Glory IV or Space Quest 6) which can also be configured in a similar manner so the CD is not required to play the game. For other games, the audioSize might be a larger value, such as 63K. For Amon Ra, I had initially tried setting the audioSize to 63K, but that resulted in audio glitches, possibly because Amon Ra uses a different audio driver (AUDBLAST.DRV) than some of the other Sierra games (DACBLAST.DRV). Otherwise, the configuration process is pretty similar. Happy gaming!
17th September 2018 | Programming
In Part 1 of this blog series, I showed some examples of how to parse out various data from a game that used Sierra's Adventure Game Interpreter (AGI). In this post, I will cover aspects of a more complex program that is used to extract the views from a game. The views represent images such as animations and inventory objects.
In an age where game sizes are measured in gigabytes instead of kilobytes, the effort to save a paltry byte here or there is hardly worth it, but in the 80s, every byte counted. This post will also cover a couple of the interesting techniques which were used to make the most out of the limited resources of the computers and media of the 1980s.
Due to the memory and space constraints of the 1980s, Sierra's programmers came up with some interesting techniques to optimize the size of their files. Some of their programming trickery, such as requiring data to be decrypted against a key phrase or switching between big and little endian, was likely employed to obfuscate the files so casual hackers could not easily take a peek under the covers and garner hints about the game. A basic compression method called Run Length Encoding (RLE) is used to reduce the size of the views, which works fairly well, especially when the same color is repeated in a row. However, RLE does not work well if there is a lot of variance in the picture, such as the random static on an old TV.
Big + Little Endian
If one peruses an old Sierra catalog, they will see that Sierra supported a multitude of different operating systems and computers in the mid to late 80s. AGI was built to support each of these different systems so the same game resources and logic could effectively be ported to many different systems (in theory, at least — in practice, this was likely much trickier).
The following table lists the various systems which Sierra supported in the latter half of the 1980s.
Computer       Processor         Endianness
MS-DOS PC      Intel 8088/x86    Little
Atari ST       Motorola 680x0    Big
Macintosh      Motorola 68000    Big
Apple IIe/c    6502/65C02        Little
Apple IIGS     65C816            Little
Amiga          Motorola 680x0    Big
TRS-80 CoCo    Motorola 6809E    Big
As you can see from the table, Sierra supported a wide range of architectures, which included both big and little endian processors. The programs I've written to parse the data from the AGI files have to read values using an odd combination of both big and little endian methods. There seems to be little reason for this other than to obfuscate the file format, rather than anything to do with a particular type of processor.
The endianness of a machine was a topic which was much more carefully observed in the 80s with a variety of systems available. It's one of those areas we probably learned about in school, but haven't actively had to worry about so much these days. This section will be a quick refresher for all of us.
Let's quickly review the difference between little and big endian. A brief overview from the web page Byte Order - Big and Little Endian:
Little Endian
If the hardware is built so that the lowest, least significant byte of a multi-byte scalar is stored "first", at the lowest memory address, then the hardware is said to be "little-endian"; the "little" end of the integer gets stored first, and the next bytes get stored in higher (increasing) memory locations. Little-Endian byte order is "littlest end goes first (to the littlest address)".
Machines such as the Intel/AMD x86, Digital VAX, and Digital Alpha, handle scalars in Little-Endian form.
Big Endian
If the hardware is built so that the highest, most significant byte of a multi-byte scalar is stored "first", at the lowest memory address, then the hardware is said to be "big-endian"; the "big" end of the integer gets stored first, and the next bytes get stored in higher (increasing) memory locations. Big-Endian byte order is "biggest end goes first (to the lowest address)".
Machines such as IBM mainframes, the Motorola 680x0, Sun SPARC, PowerPC, and most RISC machines, handle scalars in Big-Endian form.
Since an unsigned 8-bit integer can only go up to 255 (2^8 = 256 values, which are 0 - 255), if a larger value is needed, such as an offset to find a resource in one of the AGI VOL files (which contain much of the game's resources), then two bytes are needed to store the larger number.
In this example, the decimal number 49619 will be stored as two 8-bit numbers, 193 and 211. This is calculated by multiplying the high byte by 256 and then adding the low byte to the result.
193*256 + 211 = 49619
The numbers 193, 211, and 49619 are represented in binary as follows:
193 = 1100 0001
211 = 1101 0011
49619 = 1100 0001 1101 0011
For little endian systems, the least significant byte (the "little" end, which represents the value 1101 0011 in this example) would actually be stored first in memory, so it would be stored like 1101 0011 1100 0001.
Little Endian
Memory Location    0            1
Data               1101 0011    1100 0001
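As a quick sanity check, a small C snippet (my own illustration, not from the AGI tools) can print the in-memory byte order of 49619 on whatever machine it is compiled on:

#include <stdio.h>

int main(void)
{
    unsigned short value = 49619;                  // 0xC1D3 (high byte 193, low byte 211)
    unsigned char *bytes = (unsigned char *)&value;

    // On a little endian machine (such as an Intel x86 PC or Mac), this
    // prints "D3 C1"; on a big endian machine it prints "C1 D3".
    printf("%02X %02X\n", bytes[0], bytes[1]);
    return 0;
}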
The example code is taken from export_view.m, which grabs two bytes from the file, and then calculates the value. In this instance, the low byte is first and the high byte is second.
// Little Endian : Low - High
int lowResByte = getc(volFile); // res len byte 1
int highResByte = getc(volFile); // res len byte 2
int reslen = highResByte*256 + lowResByte;
In contrast, big endian stores the high and low bytes in an order which looks more "correct" and conventional to how we read numbers, placing the first part (the big end) in memory first.
Big Endian
Memory Location    0            1
Data               1100 0001    1101 0011
The example code shows that the high byte is read first and the low byte read second.
// Big Endian : High - Low
int ms_byte = getc(volFile); // high byte, most significant
int ls_byte = getc(volFile); // low byte, least significant
long signature = ms_byte*256 + ls_byte;
Side note: Another way to calculate the high byte value is to left bit shift by 8, so ms_byte*256 = ms_byte << 8.
1100 0001 = 193
1100 0001 << 8 = 1100 0001 0000 0000 = 49408
193*256 = 49408 = 193 << 8
Run Length Encoding
One crafty technique implemented to conserve space with the views is run length encoding. When retrieving the data for a view, a byte is read from the file which cleverly holds both the color and the number of repeated pixels of that color. Since a maximum of only 16 colors can be used with these games, only 4 bits (half a byte) are needed to represent the color.
This leaves the remaining 4 bits to detail how many times to repeat that color across the row. This might seem limiting, but for each pixel that is read, two pixels are drawn to the screen, so 4 bits can theoretically represent up to 32 pixels on the screen. If you look closely at how pictures are drawn (take Graham's nose from King's Quest 1 or 2), you will notice that the pixels are fairly wide, but can be fairly short in the vertical aspect. This is because the typical AGI screen has a resolution of 160x200, which is stretched out horizontally to 320x200.
Going through a number of the exported views, few of them are ever overly wide. Many of the character sprites might only be 22 pixels in width, so 4 bits can hold enough data for a majority of the time.
But what about background images, which might have large swaths of the same color (such as the sky or ground)? Backgrounds are actually drawn with vectors. If one plays an early AGI game (such as King's Quest 1 on an Apple ][), one can see the backgrounds being slowly drawn. For a plethora of other examples of backgrounds being drawn, check out the slow motion drawings at the @agistuff Twitter account.
The following is a code snippet from export_view.m to read and parse the byte containing the pixel data. The color index is stored in the first half of the byte. The number of times (loop indicator) to display the color is in the latter half of the byte. The example data will be the byte: 01000111
int pixelByte = getc(volFile); // NOTE 1
int colorIndex = pixelByte >> 4; // NOTE 2
int numPixels = pixelByte & 0b00001111; // NOTE 3
NOTE 1: Grab one byte from the file and store it as an integer in the variable pixelByte. In this example, the byte of data is the decimal number 71 (01000111). The first four bits of this byte represent the number 4 (0100 in binary). The latter half of the byte, 0111, represents the decimal number 7, which indicates how many times to repeat the color (times two, since the width of the screen is doubled).
NOTE 2: To isolate the color index value, bit shift the value to the right by 4. As seen in the example below, each bit in 01000111 is shifted to the right by four places, which moves the four high bits down to the lower section and isolates the value. In this example, it leaves the binary value 0100, which is 4 in decimal. This is the index used in a lookup table to determine which color to use (in this case, a deep red). To see the list of predefined colors, look at the initialization of the colorPalette array in export_view.m.
01000111 >> 4 = 00000100
NOTE 3: To determine the number of pixels to draw, we'll need to isolate the lower half of the byte. Perform a bitwise AND operation against pixelByte with the value 00001111 (or 0x0F in hexadecimal). Reviewing our truth tables, only 1 & 1 will result in 1, whereas all other combinations of 0 and 1 will result in 0. So, to null out the high bits, we perform a bitwise AND with 0000 on the first four bits, and with 1111 on the lower four bits to ensure those bits are preserved.
01000111
& 00001111
----------
00000111
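Putting the two halves of the byte back together, the decoded run can then be expanded into actual pixels. The following is only a rough C sketch (the row buffer and x position are hypothetical bookkeeping, not lifted from export_view.m), showing how each decoded pixel is written twice to account for the doubled horizontal resolution:

// Illustrative sketch: expand one run-length encoded byte into a row of
// palette indices. Each AGI pixel is written twice, stretching the
// 160-pixel-wide cel data out to a 320-pixel-wide image.
void expandRun(int pixelByte, unsigned char *row, int *x)
{
    int colorIndex = pixelByte >> 4;         // high nibble: palette index
    int numPixels  = pixelByte & 0x0F;       // low nibble: run length

    for (int i = 0; i < numPixels; i++) {
        row[(*x)++] = colorIndex;            // first half of the doubled pixel
        row[(*x)++] = colorIndex;            // second half of the doubled pixel
    }
}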
To isolate the concept of Run Length Encoding, I created a simple example program in Swift to further exemplify how to take a given string and perform run-length encoding on it.
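That Swift example is linked from the original post; for an inline illustration of the same idea, here is a rough C sketch (the function and buffer names are my own) that run-length encodes a string:

#include <stdio.h>
#include <string.h>

// Minimal run-length encoding sketch: "WWWBWW" becomes "3W1B2W".
void runLengthEncode(const char *input, char *output, size_t outputSize)
{
    size_t length = strlen(input);
    size_t written = 0;

    for (size_t i = 0; i < length; ) {
        size_t runLength = 1;
        while (i + runLength < length && input[i + runLength] == input[i]) {
            runLength++;
        }
        written += snprintf(output + written, outputSize - written,
                            "%zu%c", runLength, input[i]);
        i += runLength;
    }
}

int main(void)
{
    char encoded[64] = "";
    runLengthEncode("WWWBWW", encoded, sizeof(encoded));
    printf("%s\n", encoded);   // prints 3W1B2W
    return 0;
}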
Constructing + Saving an Image With NSBitmapImageRep
Once we finally get to the cel's data (that's not a typo — that's cel as in cel animation), we need to take it and load it into an appropriate container object and then save it to disk. Since most of my examples are programmed in a mix of C and Objective-C, I use the Cocoa class NSBitmapImageRep.
NSBitmapImageRep *bitmap = nil;
bitmap = [[NSBitmapImageRep alloc] initWithBitmapDataPlanes:NULL
pixelsWide:image_width
pixelsHigh:cel_height
bitsPerSample:8
samplesPerPixel:4
hasAlpha:YES
isPlanar:NO
colorSpaceName:NSCalibratedRGBColorSpace
bytesPerRow:4 * image_width
bitsPerPixel:32];
The initializer for NSBitmapImageRep looks daunting at first, but it is not quite as bad as it first seems. The most important thing to keep in mind when constructing this image is that each pixel will be comprised of an NSColor object, each of which has four components (RGBA - Red, Green, Blue, Alpha), and each of these components (or samples) takes up one byte (8 bits). A bit of quick math then shows that 8 bits x 4 = 32 bits/pixel.
Setting the pixel data in the NSBitmapImageRep is straightforward, only requiring the color of the pixel to be set at a given position. Since the NSBitmapImageRep uses an NSCalibratedRGBColorSpace for its colorSpaceName, each of the NSColors in the color palette uses the [NSColor colorWithCalibratedRed:green:blue:alpha:] method.
An example color of bright green is defined by the following NSColor: [NSColor colorWithCalibratedRed: 0.0 green: 1.0 blue: 80.0/255.0 alpha: 1.0]. As you can see, each component can have a fractional range from 0.0 to 1.0, where 0 is dark and 1 is light, so an RGBA value of (0, 0, 0, 1) is black and (1, 1, 1, 1) is white.
When looping through the data, each pixel color is set easily with the method
[bitmap setColor: pixelColor atX: x y: y];
After reaching the end of the cel image data, the bitmap is saved out as an image. In this example, the image is being saved as a PNG, but it is possible to also save in a variety of other modern image formats, including JPG, GIF, TIFF, etc.
NSString *imagePath = [NSString stringWithFormat:@"%@/export_views/%@_%d_%d.png", agiDir, key, i, k];
NSData *data = [bitmap representationUsingType: NSPNGFileType properties: nil];
[data writeToFile: imagePath atomically: NO];
Conclusion
Much of the enjoyment I have derived from reverse engineering how the AGI engine works is due to the clever techniques used during an era when computers were still woefully underpowered, so the programmers of the day had to be inventive in how they stretched the capabilities of these early microcomputers. Concepts and techniques I might have only casually brushed against in a computer science curriculum (bit shifting, bitwise operations, big vs. little endian, run length encoding) are actually put into practice here. Today we enjoy far greater resources, so we do not need to pinch and save every last spare bit and byte, but that does not mean we shouldn't be mindful about our code and assets and consider ways to optimize what we are working on.
29th August 2018 | Programming
Recently at my job the macOS Calendar has been having problems properly syncing with the Exchange server, likely due to a corrupt attachment. I came across this post by Michael Kummer detailing the steps to resolve the problem I had encountered. I went through all of the listed steps and it corrected the problem. Calendar resynced itself and the annoying warning disappeared.
Well, for a little while, at least.
The problem popped up again. Unfortunately, I haven't determined which particular event is causing the conflict, so for the time being, I had to go through all of the steps again. But since this is liable to happen again, the situation is a good opportunity for a little scripting to simplify the process.
I whipped up the following bash script which performs four steps:
- Quit the Calendar app
- Delete all of the Calendar cache files
- Kill all running processes related to the Calendar
- Restart the Calendar app
Remember to set the executable permission on the script via chmod 755 clear_calendar_cache.sh. To make things even quicker, an alias can be set up in your shell so the script can be quickly run from anywhere in the Terminal.
12th July 2018 | Programming
Out of technical curiosity, I wanted to inspect the internals of an Android app. I'm well acquainted with being able to download and inspect iOS apps on the Mac, so I was interested in how I might be able to perform the same task with an Android app and a Mac. Finding the answer proved to be a little more convoluted than I initially expected.
There are several tools and esoteric methods to try and download an APK to a Mac, but my initial cursory attempt did not meet with immediate success, so I investigated other routes. There are many options to transfer standard files (photos, documents, etc.) to and from an Android device using the venerable Android File Transfer app, but that did not allow the capability to transfer an app from the phone onto the computer.
I was already familiar with using adb to sideload an app onto an Android device, so I figured that the reverse might be a feasible solution. Indeed, this is possible. It took a couple of steps from the command line to copy the apk (named myapp in this example) onto the Mac.
- Get a list of all of the packages available on the device:
adb shell pm list packages
- Search for the identifier of the app you want and run the following command to get its path:
adb shell pm path com.example.myapp
- Finally, copy the apk file to the selected destination on your computer:
adb pull /data/app/com.example.myapp-1/base.apk path/to/destination/
Similar to how an iOS app's ipa file is just a loosely disguised zip file, the same applies to the apk file. One can use a simple unzip myapp.apk command to dump the contents of the apk file for further inspection. However, some of the files, such as the XML files, might be saved in a binary format, so they are not as easily read as unencrypted text files. This was a good start, but some further work was needed to more thoroughly explore the package's contents.
Android Studio can also be used to open up an APK, but since I did not have that installed on my laptop, I opted for the recommended tool apktool. However, if you have Android Studio already available, that is the ideal route to take.
I installed a current version of Java and apktool. Once it was in place, I was able to extract the contents of the APK using the command apktool d myapp-base.apk. This dumps the contents of the apk into a separate folder, making the XML files, assets, and other support files readable.
I hope that this small tutorial proves useful for anyone else who is also interested in being able to take an app from an Android device, put it on a computer, and then inspect its contents to see how an app was constructed. Happy exploring!
3rd July 2018 | Programming
I've been thinking a lot about King's Quest lately.
My most recent bout of Sierra nostalgia is likely due to having met a merchant at a comic con selling shadowboxes and prints of classic 8-bit games, such as this piece of desktop art of the opening screen from King's Quest 1. Now this scene of verdant greens and waving flags calls out to me on a daily basis, beckoning me to delve deeper and learn its most difficult of secrets. No, not the location of the three treasures or even the name of the gnome (ifnkovhgroghprm), but the secrets of unraveling the Adventure Game Interpreter (AGI).
When King's Quest was first released in 1984, it was quite a progressive game for the time. Although I do not recall the exact details, it is quite possible that the first Sierra game I ever played was the original King's Quest back in the mid-1980s. Even thirty-some years later, I am still ferreting out new secrets from this game (e.g. Zombie Goat, Walk On Water, Walk Through Walls), however my interests now delve deeper than finding interesting quirks and easter eggs.
A couple of days ago, I was curious whether it was possible to reverse engineer AGI, the game engine which ran fourteen of Sierra On-Line's games in the 80s. The original game (and its various remakes) has held up fairly well over the years, but there are still a number of fun "what-if" ideas I would love to implement in the AGI version of the game. While I am not interested in recreating the entire game, I think it would be a fun exercise to be able to hack the game and make some custom edits to further build upon the 8-bit realm of Daventry.
One quick Google search revealed that there are quite a few resources on creating your own AGI game or inspecting original game resources. Many of the references and tools date back to the late 90s, but some of the content is still quite useful. One of the most interesting things I came across was Barry Harmsen's presentation Reverse Engineering 80s Sierra AGI Games. You can see the results of Barry's magnificent digital spelunking at the Twitter account @agistuff. Barry has written a number of Python scripts which can extract the data from the common components found in AGI games. However, he is not even the first person to write a series of programs to extract AGI resources. If you look through the various pages of the AGI specifications, there are quite a few contributors, especially Peter Kelly who wrote several programs, mostly in Pascal.
It cannot be overstated how amazingly impressive the work is that has been put into figuring out how AGI worked. Inspired by the work previously done by Barry Harmsen and Peter Kelly, I wrote a small program in a mix of Objective-C and C which parses out the words in the WORDS.TOK file and then saves the results into two files: a plain text file and a JSON-formatted file. I mostly followed Barry's Python example, but did take a couple bits of logic from Peter's Pascal code to get my own example working. The following is the code example I used to extract the words from the 1987 PC version of King's Quest 1.
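The full program is linked from the original post. As a rough illustration of the word encoding described in the AGI specifications (each entry begins with a byte giving how many characters to reuse from the previous word, the new characters are each XORed with 0x7F with the final character's high bit set, and a two-byte word group number follows), here is a C sketch of decoding a single entry; treat it as an approximation based on the specs rather than the code from this post:

#include <stdio.h>
#include <string.h>

// Decode one compressed word entry from WORDS.TOK, per the published AGI
// specifications (an illustrative sketch). Returns the word group number
// and writes the decoded word into 'word'.
int decodeWordEntry(FILE *wordsFile, const char *previousWord, char *word)
{
    int prefixLength = getc(wordsFile);         // characters shared with the previous word
    memcpy(word, previousWord, prefixLength);

    int position = prefixLength;
    int encodedChar;
    do {
        encodedChar = getc(wordsFile);
        word[position++] = (encodedChar & 0x7F) ^ 0x7F;  // drop the end-of-word flag, undo the XOR
    } while ((encodedChar & 0x80) == 0);        // the high bit marks the last character
    word[position] = '\0';

    int highByte = getc(wordsFile);             // word group number, stored high byte first
    int lowByte  = getc(wordsFile);
    return highByte * 256 + lowByte;
}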
This is just my initial dip into the AGI pool, and I hope to return to dive into it deeper. Many thanks go out to Barry Harmsen, Peter Kelly, Lance Ewing, Claudio Matsuoka, and everyone else in the AGI community. Another tip of the ol' adventuring cap goes out to the designers of AGI: Jeff Stephenson, Chris Iden, Sol Ackerman, Robert Heitman, and likely many more who had a hand in creating these Sierra classics. Some people programmed microcontrollers for traffic lights, others developed websites for now-defunct companies, but a few people have been lucky enough to have worked on a product which still garners people's attention even thirty years later. Well played. Well played, indeed.
21 July 2018 Update: I have added two more small programs to extract the directory structure and the inventory objects for AGI games. I am also compiling the extracted words and phrases from a number of the AGI games. The lists can be (*ahem*) interesting at times to see what words are recognized. In some games like KQ1 and SQ2, the programmers took the liberties to include their names as recognized phrases (Mikel Knight and Scott Murphy). KQ2 in particular has some unusual (and unusually naughty) terms, as the programmers took other liberties to slip in some not-so-innocent terms into the vernacular.
It is interesting to see how similar terms are grouped together so the game logic has an easier time understanding what is being intended. In King's Quest 2, a number of terms and names for female characters are grouped together. This means that Hagatha, Valanice, maiden, woman, grandma, mermaid, little red riding hood, girl, and hose bag are all equally recognized, even if the actual context in the game is not correct. So you can say kiss valanice while in Hagatha's cave, or say marry hag while in the Quartz Tower (probably not a good way to start off the marriage, Graham!).