Thursday, December 12, 2019

Server Stories, part 1

An associate of the Retro Corner

I have a corner desk in our living room with a few old computers, and one of them is a Fujitsu-Siemens Esprimo P. This cream-coloured, Pentium 4 equipped mini-tower is an ordinary computer from the late 2000s. I don't know the exact model and release year at the moment; from the outside it looks like an Esprimo P5905, which also used the same D2151-A1 motherboard. The 5905, though, was at least originally shipped with Windows XP, whereas this unit came with Vista, which had not even been released in 2005 when this review (https://www.alphr.com/desktop-pcs/28016/fujitsu-siemens-esprimo-p5905-review) was written. Probably essentially the same computer nevertheless, presumably just sold a couple of years after the first release as a low-price model. As a side note, I actually bought a Fujitsu-Siemens computer in 2005 as my first genuinely modern computer of my own, and I was quite happy with it, but that was a different model and another story altogether.

The computer barely fits into the retro category, but I'm also not using it for retrocomputing purposes; this is my new home server. Or at least it is intended to be one. So far I have installed some new hardware in it (a 960 GB SSD and two 4 TB hard drives for storage, connected through a PCIe SATA-II card), but most importantly I have installed a new operating system on it: openSUSE Leap 15.1 Linux.


A fake server hiding under family art.


The stated minimum system requirements for openSUSE Leap 15.1 are a 64-bit CPU, 1 GB of RAM (2 GB recommended) and 5+ GB of disk space (https://opensuse-guide.org/installation.php). My old F-S computer's specs are easily above the minimum: the processor is a 64-bit model and there is a whopping 3 GB of RAM. It should even be possible to get 64-bit Windows 10 installed on this with no problem, but I just wouldn't want to actually use that combination. However, the installation of a modern OS did not go through with default settings. I made my boot DVD and the computer booted from it fine (if I recall correctly, the installer even had a fancy winter/Christmas theme on the first try, as the computer's real-time clock battery was missing and hence the date was well off), but when I tried to start installing, the computer worked away for quite a while until the progress counter eventually froze at 100% and nothing ever happened.

Shots into troubles

In a typical modern installation guide, like the one linked above, there is no troubleshooting section, and the mostly self-explanatory installation process just glides through like a dance when you're the prom queen. Of course, most people wouldn't install new operating systems on old computers anyway, so the inevitable issues are met by only us few... In any case, I had to figure out what went wrong.

The problem was in the boot menu's Kernel Default setting, which had to be changed first by pressing F5 in the DVD's GRUB boot/install menu. This computer was old enough to either not support APIC (the Advanced Programmable Interrupt Controller) at all, or at least not the version expected, so I had to turn it off for the installation. Later on I had to edit my /etc/default/grub file, probably for the same reason; without the edit the computer would not shut down by software command, but after it the ATX software power control caused no issues.
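For the record, the kernel option and the GRUB edit I mean look roughly like this. Treat it as a sketch rather than an exact recipe: 'noapic' is what worked here, and the right parameter set depends on the machine.

```shell
# In /etc/default/grub, add the kernel parameter to the default command line.
# "noapic" tells the kernel not to use the APIC this old board mishandles.
GRUB_CMDLINE_LINUX_DEFAULT="noapic"

# After saving the file, regenerate the GRUB configuration and reboot
# (openSUSE keeps the generated file under /boot/grub2):
grub2-mkconfig -o /boot/grub2/grub.cfg
```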

Anyway, after this setting change I was able to install the new Linux distro. I've never had openSUSE before, so it would be a new learning experience after having mostly used Debian-based distros. Especially as I opted not to use a GUI at all; there wouldn't be an excess of computer resources to waste, and most of the time I was not intending to sit next to a monitor with this computer anyway. SSH would be the way to control this partner in computing, straight from my main workstation in the next room. I just needed to install an SSH server (sshd, the SSH daemon) and set my firewall rules properly, including swapping the default port 22 to some 5-digit number for a bit of additional security. The first differences with the new distro came up right away: to install stuff I can't use apt-get commands but zypper, and the default firewall application is not UFW but firewalld, run through firewall-cmd.
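For reference, the rough command sequence would be something like the following on openSUSE; the port number 54321 is just a placeholder for illustration, not the one I actually use.

```shell
# Install and enable the SSH daemon (zypper instead of apt-get on openSUSE):
sudo zypper install openssh
sudo systemctl enable --now sshd

# Change "Port 22" to e.g. "Port 54321" in /etc/ssh/sshd_config, then restart:
sudo systemctl restart sshd

# Open the new port in firewalld (firewall-cmd instead of UFW) and reload:
sudo firewall-cmd --permanent --add-port=54321/tcp
sudo firewall-cmd --reload
```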

This being said, about not having a GUI, I do recommend installing the fish shell to make life on the CLI a bit easier, even if you use the terminal just occasionally. Fish gives a somewhat PowerShell-like way to browse available commands with TAB and offers other usability features, such as colouring terminal text a bit like programming text editors (i.e. IDE applications) do. The default Bash is powerful for sure, if you know how to use it, but it is essentially just an enhanced version of 1970s technology. A barebones Unix shell can be really daunting to use if you have no long experience, no tutor to support you, and no proper thick manual within reach.
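Getting it is a quick job; changing the login shell is optional, since you can also just run 'fish' whenever you feel like it. The binary path below is the usual one but may vary, so check it first.

```shell
# Install fish and make it the default shell for the current user;
# verify the path first with: command -v fish
sudo zypper install fish
chsh -s /usr/bin/fish
```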

Yes, you can google every command separately if you have another computer around (or use remote access), but googling is just a terrible mess compared to having any of those three. Googling gets stuff done, but it's really like going to the nearby woods to fetch berries or mushrooms: you might know how to find them, there might be fresh stuff around to pick, and then it's just a matter of time, but it sometimes takes a lot of time to find what is needed, and usually you have to search several places at once. Not to mention that lots of data on the Internet is already well rotten, its due date ten years behind, or was worm-eaten to begin with, and it can be hard to see from afar which one is a good pick - at least with books it's possible to judge from the cover (or front pages) when it was originally released, and books are in theory more likely to contain verified facts. That of course is not always true either, but now I'm getting sidetracked.

Server is up and sitting

So now I can access my fancy home server from another computer on my local home network (actually I set it up to be accessible from the outer web as well, but that needs stricter security rules about how, who and from where). What then? What to do with a home server? Many people don't really seem to have a clue what to do with a server, and I suppose that is understandable, especially as I can't do just whatever with this one. Nevertheless, let me think of a few options.

1. File server. I can set up a private file sharing server that can also provide network drives for all the local devices we have at home. Alternatively, it could be used for transferring data to other people. Sure, there are all kinds of free cloud services available, but if you care about privacy the free ones become a bit less tempting. On the other hand, if you need space for hundreds of gigabytes of data or even terabytes (video material in my case hogs probably 80% of my used terabytes), the free services are just not sufficient, and paid services can actually become pricey at higher storage levels.
2. Web page server. Well, I'd lack DNS support, so people would need some fancy http://111.111.111.111/sakariaania style address to access it, but I might be able to use it myself for things like my own data and an easy check on whether the server is still up, and I could provide direct links for people who need them. It would always amuse me to have my own WWW server at home.
3. Surveillance camera. "Why would you need that?!?" Yeah, I have nothing to steal and I'm always at home anyway, right? Any home can be broken into even if there is nothing really valuable inside, and getting photos of the intruders and the time of the event might help a lot in figuring out the possible crime. Like the risk of fire, such things are unlikely, but it's still best to have some preparation should this unfortunate thing happen. On the more practical side, I could check whether any mail has arrived when I'm actually not at home.
4. Mail server. Okay, for this too to work properly I'd need some extra services set up, and I might be quite content with my other email addresses... but at least it would make it possible for my server to send email notifications if some service fails - or even to build some kind of MFA system.
5. Remote control. Technically, if there is anything I can control by computer and wish to have remote access to, a server with just SSH access could become the tool for that.

There are of course various other possibilities, but those strike me as realistic and usable.

In addition to all this, for me it is also largely just about testing and learning. If I ever really need a server, I can more easily set one up, and I may also encounter things that help me understand what I meet at work. The hands-on method teaches me a lot more than being handed a server IP and credentials for remotely logging into some server to do Active Directory management, for instance. That doesn't really tell me what the server is for real. By setting one up I get a somewhat better impression - even if no enterprise environment would rely on a server built on a 10+ year old home computer, especially as nowadays hardware servers are set up less and less frequently when you can just have virtual servers.

RAID over Sakariaania

For now I'm only going to go for the file server option, as that is what I actually need. Since I bought two identical 4 TB hard drives for this specific purpose, I also felt like testing a RAID 1 setup with them. For a moment I thought my PCIe SATA card with a RAID controller would actually work as hardware RAID, but then I found out that these are deemed "fake RAID" devices. Not only that, but especially on Linux it is recommended to use a software RAID setup instead, both because fake RAID can cause technical issues (including reduced reliability: should one drive or the computer itself fail, the other drive might not be recoverable on another platform) and because Linux's own software RAID tooling is supposedly very good and robust. It just might, maybe, take slightly more resources than the controller-assisted fake RAID. Is my old Esprimo up for the task?
SATA-II PCI-Express Card still unpacked.

The first shock came after installing the card and drivers. The card was supposed to have its own BIOS for automatically setting up a RAID, which could be entered by pressing a function key during the computer's self-test. I guess the key was correct, since pressing it froze the whole system. When I merely started the computer, I could not find the drives at all. Oh dear, did the card not work on this computer at all? Silly me, however; I had forgotten that the new HDDs were completely unprepared. I went to my other computer, created partition tables and formatted the drives. After that I could find the drives on my Fujitsu-Siemens as devices, but not in the file system. For a little moment I had also forgotten that on a Linux like this the drives would not be mounted automatically either... such corrupting effects all those modern convenience features can have. So after partitioning, formatting and mounting, the drives could be found on the server computer - as separate entities.
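The preparation steps amount to roughly the following; /dev/sdb is a placeholder, and it's worth double-checking with 'lsblk' which device really is the new, empty drive before running anything like this.

```shell
# See which device is the new, blank drive first:
lsblk

# Create a partition table and one big partition, then format it:
sudo parted /dev/sdb mklabel gpt
sudo parted -a optimal /dev/sdb mkpart primary ext4 0% 100%
sudo mkfs.ext4 /dev/sdb1

# Mount it by hand; for a permanent mount, add a line to /etc/fstab:
sudo mkdir -p /mnt/data
sudo mount /dev/sdb1 /mnt/data
```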

The Linux tool 'mdadm' lets you set up a software RAID on whichever partitions of whichever drives are available, so that is the recommended option for a software RAID. I can already tell that it worked without problems and wasn't too hard: it took just a couple of commands to assemble the RAID and then to enable it. Another option would potentially have been the 'dmraid' tool, with which it should have been possible to work with the PCIe card itself, but I ended up not even trying that for real. As mentioned in the previous paragraph, I was never able to reach the card's BIOS on this computer, which was a bit of a bummer. Fortunately, the disappointment faded gradually, first when I found out that dmraid could have set up the necessary things, and even more when I found out that it's likely not a good setup anyway.
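The couple of commands in question go roughly as follows; the partition names here are placeholders, not necessarily what your system shows.

```shell
# Create a RAID 1 mirror from the two 4 TB drives' partitions:
sudo mdadm --create /dev/md0 --level=1 --raid-devices=2 /dev/sdb1 /dev/sdc1

# Put a file system on the array and mount it:
sudo mkfs.ext4 /dev/md0
sudo mkdir -p /mnt/raid
sudo mount /dev/md0 /mnt/raid

# Record the array definition so it assembles automatically on boot
# (openSUSE keeps the file at /etc/mdadm.conf):
sudo mdadm --detail --scan | sudo tee -a /etc/mdadm.conf
```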

If I now check, for instance, the disk free status of all drives on my server terminal with the command 'df', I see the RAID disk listed as /dev/md0. A moment later I can mount it as a network drive on my main computer. Mission successful? Not quite; this is just a start. And because this is also a test, I will still change my drive setup a bit.
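The "network drive" mount on the workstation side is just sshfs; the host address, port and paths below are placeholders standing in for my own setup.

```shell
# Mount the server's RAID over SSH into a local directory:
sudo zypper install sshfs
mkdir -p ~/server-data
sshfs -p 54321 user@192.168.1.10:/mnt/raid ~/server-data

# And to unmount it when done:
fusermount -u ~/server-data
```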

The first copying test was not perfectly impressive, though. The computer is normally rather silent, but when I first tested copying a few GB worth of data over the SSH file transfer protocol, just to test functionality, the CPU hit 100% and the computer fan jumped from around 1100 to 2600+ RPM, which made it rather loud. It was night time too... oops. Copying locally from drive to drive, the CPU hogs "only" around 80% and doesn't make much sound. The normal idle temperature is around 40-50 C, yet this 100% CPU transfer situation took the temperature close to 80 C - but barely further. So the temperature was not that bad, the computer made it OK, and the main issue was just the fan. I might consider either swapping the fan or adding another silent one to improve the cooling at some point.
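For watching the temperatures and fan speeds on the command line there are the lm_sensors tools; the package name below is my assumption for openSUSE, so check with 'zypper search sensors' if it differs on your system.

```shell
# Install and set up hardware monitoring, then read the sensors:
sudo zypper install sensors
sudo sensors-detect   # answer the probing questions; defaults are usually fine
sensors               # prints CPU temperatures, fan RPM etc.
```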

Nevertheless, this test made me reserved about the RAID setup. Was the software RAID too heavy after all, or was it just SSHFS? On my subsequent file transfers there was no similar heavy fan load for whatever reason, and when I checked the processes' resource usage, it was actually not mdadm that took the resources but sshfs. Transferring to the non-RAID drive took lots of resources as well, so I suppose the software RAID is not really the problem; transferring files over the connection is simply a heavy process for that computer. Although that is somewhat difficult to say for sure, since the sync of the md0 RAID device seems to spawn multiple processes that can take a few % of CPU even when nothing is really being done, and that might stack up into unnecessary drain on an old computer like this. I'll probably remove the RAID eventually after testing - it's not really needed after all, but I had never tried it before. Instead I can just run scheduled backups every now and then.
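Those background sync processes can at least be watched, and even throttled, through the kernel's md interface; lowering the maximum sync speed is one way to calm an old machine down during a resync.

```shell
# Show what the RAID is doing (resync progress, state of /dev/md0):
cat /proc/mdstat

# The kernel throttles background resync between these two limits (in KB/s);
# writing a smaller number into speed_limit_max slows the sync down:
cat /proc/sys/dev/raid/speed_limit_min
cat /proc/sys/dev/raid/speed_limit_max
```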

The Linux command line also has the convenient command 'time' for checking the duration of any command, so I could test the transfer speed. By running the command "time cp [/source/path] [/target/path/through/sshfs]" I got the following results in two different tests:

From local HDD to server RAID HDD.
real    5m55,582s
user    0m0,212s
sys    0m25,496s

From local HDD to server SSD.
real    5m31,433s
user    0m0,238s
sys    0m25,763s

My test package was 15.9 GB worth of video files. Transferring to the SSD was a bit faster than to the HDD RAID, as expected, although I'm also sure the SSD can't reach its full potential on that computer. I also think the motherboard SATA is only SATA-I, whereas the PCIe card should at least in theory provide SATA-II connectors. Still, I guess 45-50 MB/s is a fairly decent transfer speed for this kind of setup. I have yet more tests to run before further conclusions, and I might still test whether I can actually get the RAID set up with the 'dmraid' command set. We shall see when the next part comes.
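The MB/s estimate is simply the package size divided by the wall-clock ('real') time; a quick sanity check with awk, counting 15.9 GB as 15.9 x 1024 MB:

```shell
# Rough throughput from the 'time cp' results above:
awk 'BEGIN { printf "RAID: %.1f MB/s\n", 15.9*1024/(5*60+55.582) }'   # RAID: 45.8 MB/s
awk 'BEGIN { printf "SSD:  %.1f MB/s\n", 15.9*1024/(5*60+31.433) }'   # SSD:  49.1 MB/s
```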

Can't deny it though - a warrior pair like this is more or less asking to join a RAID.

Wednesday, August 28, 2019

Retrocomputing 4 from Outer Fiction

Because this is a kind of diary and I like to keep a record of the books and movies I've gone through, I'll mention in passing that I read a book by Brian J. Robb: Vastustamaton Brad Pitt (original title The Rise to Stardom, 2009). Nothing special about that as such, but I suppose a few thoughts on it would be appropriate. The book was quite nice to read, yet it's not something I think I'd read again. I got some information about movies with ties to Brad Pitt, and I guess he can be deemed a more noteworthy actor than the pretty pin-up boy reputation, which he himself clearly disliked, would suggest. I also have to say it is kind of sad that in Finland it is actually relatively hard to find books in their original language even when they're in English; I only read the Finnish translation, as I happened to get the book from library removals. However, this book and my thoughts about it have nothing to do with the topic tonight.

On the other hand, some other books and films I've met recently merit more remarks. Were I not interested in retrocomputing, or alternatively had I read and seen those works of fiction while they were all new, I might not have paid that much attention. The book I'm referring to is Syvä kuolema (Deep Lies, 1986) by Stuart Woods, and the movies are the two agent Jack Ryan films based on Tom Clancy's books, starring Harrison Ford: Patriot Games (1992) & Clear and Present Danger (1994). For those who are considering reading/seeing these without having done so before, I'd like to note that I'm not planning to give away any major spoilers. I'm mainly going to talk about certain technology used in the fiction.

What combines all three is that they're agent stories. I used to think that I'm not interested in such stuff. In fact, even while Tom Clancy's The Hunt for Red October and Red Storm Rising were among my favourite books as a kid, I didn't really like any of his later books, as I felt they were too much agent stories (I think I did read The Sum of All Fears and Clear and Present Danger in Finnish). Therefore I found myself surprisingly indulging in these plot-based pieces of fiction. It might be that I think differently now that I'm older, but these also feel fresh to me since I haven't really gone through this kind of fiction much. Reading plot-based books also feels fresh after so many years and pages of research books from the university.
I suppose I need to get this bookcrossing
case back on the road soon.

Especially this Deep Lies surprised me, as I was expecting a quite standard, nothing-too-special plot-based thriller. The cover actually mostly amused me, especially as it shows a few stereotypical spies in trench coats, shades and brimmed hats, which the book really doesn't have at all, and which hints that the cover artist had not exactly read the book - like it usually goes. I also have to admit that much of what happens plotwise didn't really impress me (many things seemed too foreseeable in the long run), and if the twists don't hold you in tightly enough, the "thrill" element goes largely to waste. That has actually been a major reason why I didn't care much for thrillers and horror: the expected emotional effect would hardly ever reach me. Later on I've found other points of interest in both genres.

In any case I was wondering why Deep Lies would not be classified as a techno-thriller, and what actually makes a techno-thriller rather than a normal thriller. A significant plot device in the book is based on using computers and other technology instead of traditional on-the-spot agents. Otherwise, submarines play a significant role in the book, starting from the Whiskey-on-the-Rocks incident on the Swedish coast in 1981 (a Soviet Whiskey-class submarine got stuck on rocks, openly visible on the surface close to a Swedish naval base). There are no new innovations like the Russian stealth submarine of The Hunt for Red October, but otherwise I could not help thinking that Clancy's books might well have been one source of influence while the novel was being written.

As a retrocomputing hobbyist, the most amusing part of the book for me was nevertheless that the people were installing an IBM PC AT based multi-user, multiprocessing computer network system to make things work smoothly and to have data stored and shared quickly for those who needed it. Some Western agents were mocking Soviet computer capabilities, as the Soviets didn't even have MS-DOS compatible computer systems. There were also several sections depicting the text PRODUCED BY THE COMPUTER:
ON ITS SCREEN
WHILE THE USER
TYPED COMMANDS.

LOADING RESTART SEQUENCE FOR BLOG NOTES... PLEASE BE PATIENT.

It was argued that those computer models were hard to get even in the USA at the time, so I suppose the book was meant to be set around 1984/1985, as the IBM PC AT was originally released in 1984. I cannot recall it ever being directly stated when the book was set, but it was clearly intended to depict times contemporary with the book's release. By 1986, when the book was released, however, I think PC AT models were not that rare anymore, as the IBM PC AT itself was discontinued by 1987 and Compaq had already released a 386-based PC compatible in 1986. Regardless, big 5.25" floppies were there to save the day.

Then to the Clancy novel film adaptations with Harrison Ford. I was constantly trying to look at the computers they used in the offices. I even had to stop, rewind and freeze the frame in one of them just to take a closer look at a computer in the background - I believe it was some IBM PS/2 model. My wife was left rolling her eyes when this happened, as we were watching these movies together: is this guy here just to catch computers? To me it mattered a lot for the atmosphere too, since the movies are set in the 1980s, and had someone had, for instance, a CD-ROM equipped 486 PC on his desk, it would have felt like a rough anachronism. However, all the models seemed appropriate for a late-1980s CIA office as "the best tech we got", and details like that give a good impression of a film.

Another thing worth interest, in my opinion, was that Patriot Games had a scene unlike any I can remember seeing in another movie. Maybe I just haven't seen such a movie, but in any case: there was one action scene depicted almost entirely through the satellite-link view of the office agents and managers. All the soldiers were seen through the satellite camera from a bird's-eye perspective, and it looked just like some 2D computer war game. The whole scene distanced the real acts very efficiently, and in such a distinctive fashion that it forced you to think about how it would feel: you have to make educated guesses from fragments of technical intelligence data, and then you just send in the troops without any direct involvement. Should your guesses go wrong, you could risk many innocent lives, and in any case someone else does your dirty work when needed.

The scene I'm referring to can be found from YouTube actually:
https://www.youtube.com/watch?v=ZoVWedQOQl4

I suppose studio executives and movie producers would nowadays deem that this kind of scene would not sufficiently appeal to a modern audience, due to the lack of close-ups with fancy CGI explosions in colour and other stunning (and often silly) visuals. To me the scene felt much more realistic than typical movie action scenes, and most of all it felt much more emotionally impactful than the typical "show it all" movie entertainment. It has a bit of the same effect in film narration as when the camera is turned away from the actual murder, which then feels more terrible than if the kill were actually shown. I'm still somewhat surprised that scenes like this are not more commonly used in movies.
Would you watch a movie with this cover?
Well I did, and was positively surprised.

Then finally, as a bit of an extra, I saw just today a movie called Speed Racer (2008) by the Wachowskis (The Matrix). The movie was a box office failure, and I was expecting it to be rather bad, but I actually found it quite watchable material. I'd deem it at least more or less a good bad film (i.e. a movie with various things done badly but which is quite entertaining to watch). The style of visual narration is cartoon-like (well, it's actually based on an anime of the same name) in a quite personal and even unique fashion, with an overwhelming amount of certain kinds of effects and transitions. The movie is supposed to be a family film, but some scenes might be a bit questionable for the youngest kids. In fact, overall, it would surprise me should this movie not have attained at least some degree of a cult following. I could definitely recommend it - with reservations - despite certain silliness and some nonsensical factors; it's not even trying to be realistic anyway, it is basically sci-fi stuff, and it has some almost dystopian visions of corporate power. In some parts it could even be compared to Death Race 2000, such as when the Casa Cristo "rally" features stylized teams with military and barbarian themes, among others, with the leader of the latter exclaiming: "Crom!" On the other hand, the movie's heavily computer-generated, colourful world reminds me a bit of Tron (1982, an early feature film famous for its extensive use of computer imagery). Still, I can see how this wild, gamelike action movie, with its partially almost surreal montage and partially achronological narration, would be rejected by quite a few people in the West.

Speed Racer, however, unlike the Brad Pitt biography, is related to the topic. In the first half of the movie there is a scene where a corporate manager tells about his own past in passing: how he had to work hard himself, just like other people, before his company got so big and powerful. He worked hard with a Commodore 64 in order to reach the greatness he now enjoys. Can you imagine? Considering that the setting is in some parts quite (retro)futuristic, or at least an alternate-reality timeline, it felt almost out of place to have a real computer classic mentioned like that. But maybe the Commodore 64 was the professional computer workhorse in the alternate history of that movie, instead of the IBM PC and compatibles, and as such I'm not complaining about such an amusing and genuinely unexpected reference.

Sunday, August 4, 2019

Theory of Books


A Book of Plot, a Book of Knowledge, a Book of Curiosity and a Book of None
One evening, after an enjoyable reading session with the Finnish translation of James Clavell's Shogun, a theoretical lightning bolt struck next to me with a question to be answered: why read a book? The answer became formulated, as part of Zacharian literature theory, in such a fashion that I could eventually see four possible reasons to read a book. The book could be (1) a Book of Plot, (2) a Book of Knowledge, (3) a Book of Language or (4) a Book of Curiosity - or a combination of any of these four types.

The book Shogun I could immediately classify as a Book of Plot: the story and the unfolding events were a significant point of reading it. What would happen next, and how would the foreshadowed events X and Y end up going? The narration of the book relied much on storytelling, even while the rough ending was known from a historical point of view from the beginning (the protagonist Blackthorne, based on a real-life character, would become a samurai and a vassal of the next de facto ruler of Japan), and hence the plot was clearly an important factor in this fictional yet historical novel, which was also obviously intentional on the author's part.

Due to another Zacharian literary theory concept, maximalism - in which more is better if it fits the same space as easily as less - I figured I had enjoyed the book more than if it had been merely a Book of Plot. It took me but a turn of a page to figure out that, at least for me, the novel was also a Book of Knowledge: I learned a great deal about Japanese culture, history and even language while reading it, even while acknowledging that much of the history told was more or less romanticized. Admittedly, certain things, such as ninjas as mythical warriors, do originate from Japan itself and are not simply Western misconceptions; the image of fictional black-clad ninjas with almost legendary abilities was conceived in Japan around the early 19th century - hundreds of years after the historical period of ninjas - similarly to how popular culture visions of Caribbean pirates are largely based on non-contemporary fiction such as Treasure Island.

Thirdly, I ended up finding the novel to be a Book of Curiosity, after I had talked about it with my wife (who had not read it, and by my assumption will not). A Book of Curiosity in my mind means that the book has something of interest that is not due to anything directly written between the covers; the meta content is what is curious. It could be something special about the author or about the book's success, for instance. In this case it was that the book, after its appearance in 1975, had apparently become a significant source of the interest in Japanese culture that arose in the West after the Second World War. I cannot confirm how successful the book really was in that respect, but at least according to potentially fictional Wikipedia remarks, it seems the novel was rather popular and a major source of inspiration towards Japan in an era when there was practically no manga and anime in the West (to be honest, the modern manga and anime field was still in its infancy in Japan too before the 1980s, in my opinion).

As a Book of Language I was unable to classify the novel, since by that I didn't mean a study book of a foreign language (which would classify as a Book of Knowledge), but rather something that could be called poetic. The novel has a few poems in it, but they're more like background imagery, and the actual language seems relatively plain. I did learn what 'tacking' means ('luoviminen' in Finnish), although I was familiar with the practice of sailing against the wind, but in general I didn't find the language itself particularly fancy or special. Of course I don't know if the English original was different, but I doubt it - the text just focused on telling the story for the most part. I also first called the concept the Book of Words, and frankly, I'm still not certain which name fits better.

To promptly test this new theory further, I applied the concept to a few other books I've read recently. Namely, I can think of three more: William Rodgers' Think: A Biography of the Watsons and IBM (or more specifically the Finnish translation: Ajattele! - IBM:n tarina), Sofi Oksanen's Stalinin lehmät (Stalin's Cows would be the English translation) and Aki Rantala's Linux. Oksanen's novel I did not finish, as I felt it too exhausting to crawl through its almost 500 pages (in comparison, Shogun is around 1200 pages and I could have taken more, no problem). Rantala's old Linux guide has but a few pages left; some parts were very interesting and some I almost had to force myself to read. Finally, Rodgers' sort-of-biography I almost regretted having started, until the latter half of the book.

The Watson-IBM book I read first, and it's been a while since I finished it. The Finnish translation did not present it clearly as a biography of Watson, so I was looking forward to reading more about the early history of computers - and in that I was disappointed. I did know that the original book was released already in 1969, so it could not yet have anything about the home computers I knew more about and expected to be more interesting, but I was not expecting the book to start from the 19th century, telling about a salesman who grew into power and riches through rather questionable practices, occasional double standards and even immoral behaviour. What's worse, Watson himself seemed to be a businessman who didn't really understand computers when they finally became available; it just happened that his mechanical business machine manufacturing company was well capable of producing them when the technology arrived.

So this IBM book is not a Book of Language - the text is quite plain, with largely simple expressions, supposedly maybe in an attempt to capture the spirit of Watson, who was made to sound like a dictator giving a simple message to his simple folk, and who was admired by the author. The first part takes steps towards a Book of Plot, but that doesn't really work out in my opinion. I didn't find it very interesting to learn how Watson obtains his next monopoly region, especially as the narration often put the details in the fashion of "Watson was a skilled salesman", and, as is typical for a biography, there was no fully consistent story to tell, although that is natural of course.

What the book is, however, is a Book of Knowledge, although in a somewhat limited fashion. The book didn’t seem very critical, and it was quite hard to estimate whether some unreferenced claim was true or not. Was Watson as innocent in his earlier days as the book seems to suggest, and was he really that sincere when being polite to some salesmen? Maybe, maybe not, but the book does not exactly allow checking any facts itself; it proceeds with an old-fashioned shut-up-and-listen-to-the-story “factual” narration, which obviously doesn’t always feel like the complete truth. Yet no doubt there are many truths in it, so despite its potential missteps, it still contains knowledge.

However, perhaps the book belongs more to the field of a Book of Curiosity. It still gives a certain background history for IBM, which eventually did not become as world-dominating and unstoppable in the field of computers as anticipated, largely due to the unforeseeable rise of home computers at the turn of the 1970s and 1980s, which IBM was not swift or successful enough to capture. Yes, despite the IBM PC becoming the computer that paved the way to a standard upon which even modern computers are based, I still find it kind of unsuccessful if the intention was world domination. IBM PC compatibles prevailed, but IBM itself faded from the point of view of a regular consumer and largely returned to its mainframe production after the 1980s. The book speculates that nothing could prevent IBM from permanently becoming the biggest company in the world. On the other hand, at the end of the book there is some curious pondering, such as how it might have seemed like wild imagination at the start of the 1970s, but that computers would eventually be everywhere – it took a moment, but we are there now.

In summary, I suppose it was a book worth reading, yet I could have lived without it as well. The theory seemed to hold, since the book felt not worth reading only where I was unable to see curiosity value or knowledge, and there really was nothing else in the whole book.

Rantala’s book about Linux is another Book of Knowledge, with hardly even a trace of plot narration or poetic narration. It won’t even pass on the curiosity side, since there are plenty of books like it, and I don’t really see why mere age would make it special in this context – the book is from the early 2000s. The age was also a problem at times, since some parts, especially those about things like contemporary Linux distros, GUI applications and the installation of a then-new Red Hat Linux version, simply felt obsolete and as such irrelevant. That took a step towards curiosity and could have been deemed historical knowledge, but for the most part it just didn’t give me much.

However, the overview and history of Linux, plus especially the Linux core operations, console commands and scripting, are still surprisingly relevant knowledge despite being over 15 years old. How much usable information about a modern OS would one get from a Windows guide from over 10 years ago? Someone might argue that this indicates a Unix-based OS is just some old nonsense, but I’d say such thoughts would be wrong. Rather, I see it as consistently upholding the past and not discarding long-learned and applied practices every few years, as computer systems otherwise seem to favour rather often. Admittedly though, a large number of Unix/Linux console (Bash) commands are completely unintuitive for a beginner, and without some external knowledge it would be quite hard to figure out what commands like cat, df, finger, top or dpkg do. At least a GUI usually gives more direct hints; likewise PowerShell commands are usually logically understandable English.
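To illustrate the point, here is what a few of those cryptic names actually do. These are standard tools, but the specific invocations are just my own examples, not anything from Rantala’s book:

```shell
# Nothing in these terse names reveals what they do:
printf 'retro\n' | cat   # "concatenate" - copies files (or stdin) to output
df -h /                  # "disk free" - free space on the root filesystem
# top                    # live process and resource monitor
# dpkg -l                # list packages installed by the Debian package manager
# finger alice           # show info about user "alice" (rarely installed today)
```

The names make sense once you know the history (cat really is for concatenating files), but a GUI menu entry saying "Disk usage" would have told a beginner the same thing up front.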

Finally, the last poor book, which I decided not to even bother finishing. After over 100 pages I had not really found any proper plot, just fragments of seemingly irrelevant narration about individuals from different eras, connected of course by family ties or the acquaintanceships of parents and such. It might be partially my own fault, but this kind of postmodern narration simply gives me nothing, and in my opinion serves nothing that matters either. Yes, sure, we get these very subjective images of how hard it is to live as a bulimic, or of how difficult someone felt her life as a woman in Soviet-era Estonia was, but to me large parts of it felt about as interesting as reading a phone book (just names swapped with foods eaten and numbers with counts of vomits), and recollections of how I should have passed a tree from the left instead of the right, and maybe then something would have been different when arriving at the other side of nothingness. So basically there simply was no plot-based narration that mattered to me at all.

The writing style would not qualify for the Book of Language either, because I felt the language used was relatively simple and poorly structured. That might have been intentional, to rouse a certain feeling that nothing matters to the protagonist, but it’s a failure if the book starts to feel so monotonous to the reader that it simply does not matter. Also, the attempted steps towards a Book of Knowledge with historical references seem too subjective and fictional to be tied to almost anything significant to know about. That there were lots of Russians, and their influence in Estonia lasted decades of the 20th century, is not exactly news, or anything that would make this book worth reading. References to the Estonian resistance fighters, the Forest Brothers (metsäveljet in Finnish, metsavennad in Estonian), do come close, but unfortunately those historical glimpses are just rare drops in the sour pool of tears and, at maybe one tenth of the content (I did not count), did not impress me enough to really earn the book a class.

One funny occasion did come across to me, though: a recounting of a story about a foreign woman selling herself in Finland after the collapse of the Soviet Union, carrying a sign: “pilu 50 mk” (meaning ‘pusy 50 marks’, with the first word misspelled and the sales price extraordinarily low for the time). This is funny mainly because it almost reaches the curiosity class, yet it actually seems to be a common unconfirmed urban legend, apparently told as if it had happened all around Finland. My mother even told it in the 1990s as having happened in Heinävesi (a small town in Eastern Finland), where this woman supposedly stood in the town market square... obviously she never saw the woman herself. I had forgotten the story, but by then at the latest I saw how impossible it was to have happened for real – or at least, had it happened for real in all the places it supposedly occurred, it is strange that no one seems to have first-hand eyewitness experience of the case (the nearest you can get is “I know personally someone who saw a woman like that”), despite the poor woman apparently touring wide regions of Finland keeping a high public profile.

Anyway, I don’t see this as special enough for real curiosity value, and the book is not even Oksanen’s breakthrough novel or otherwise too significant. Her later work Puhdistus felt a bit better, but I have to admit I’m not really impressed by her writings overall, and I am kind of ashamed that over ten years ago I suggested her to some foreign person as a Finnish author to read – not because I had liked her work myself, as I had not yet read any of it, but because she had by then become a notable and praised author in Finland. Never praise something you don’t have first-hand experience of.

Somebody might not be happy with me disliking this Oksanen book, but on the other hand I doubt many such people will ever come to read this text. Therefore, overall, the theory would seem to work again – Stalin’s Cows did not feel worth reading overall, and likewise it did not reach a class in any of the four specified types. Some steps towards them, but only partially.

Still, I do admit that my classification is quite subjective in several respects. Someone might feel plots differently, for someone knowledge in some field strikes at a whole different level, for someone curiosity might fit a very personal meta level, and for someone the beauty of language is depicted with entirely different words, and so on. Another issue is that Oksanen’s book could be argued to have a certain virtue of delivering ideology. However, delivering any ideology, intentional or not, is not a good thing, as I find it propaganda, and all known -isms are by default dangerous if they grow strong enough. If brought out sufficiently strongly, of course, that might make a book worth a curiosity akin to Mein Kampf, which I expect to have no other value really. Make no mistake though, there is nothing akin to that in any book referred to in this discourse; it is a question of theory.

Friday, May 3, 2019

From video editor's journal

There's been some silence on my video front, and it's not just because of the Amiga references of the previous posts or spending much time with my family and paid work. Nevertheless, I do have several plans going on and have got some work started too. I suppose I could divide this post into two parts related to video making, of which the first is more strategic and the second more tactical. In a way.

So first of all, I have plans for videos on the following topics regarding Zacharian Computer Detective episodes:
- Apple Macintosh video (in post-production: editing, most of the episode raw video exists)
- ICL MikroMikko Pentium video (in production: part of the episode on raw video)
- Nintendo 64 video (in (pre) production: parts on video, much undone, however)
- Atari 8-bit video (in pre-production: still planning what exactly to include)
- Brother digital typewriter video (in pre-production: still planning)
- NEC PC-88/98 video (in pre-production: I'll need quite a few things to get either of these properly running, but on the other hand I might start them as a series, beginning with ordering them etc., as it was not so simple to even get them in the first place)
- Amiga content exploration from old disks (in pre-production: needs a session arranged when there is time).

There are also some other topics that I have postponed at least for now, even though there might be some related work done already:
- Nokia Data MikroMikko 386 computer (got some video, not sure how to fix it yet)
- Schneider portable 386 computer (might go to pre-production at some point)
- Windows 98 computer, circa 2000 type (could easily start working with this, but pending)
- IBM AT motherboard to a working computer (missing casing and time)
- Locked Thinkpad motherboard (lacking time for the operation, also might lack some parts)
- Various A/V mixer devices (a bit off topic, but maybe later)
- Several Lenovo/IBM Thinkpads
- Unpacking of everything (might get these done at some point; I have lots of video material already, but there are already too many unpacking videos out there)
- Actually quite a few other things, smaller and bigger, that might connect to some other stuff as well.

Whoah, that already made quite a list without even really trying. It wouldn't take much effort to think of more stuff I have more or less pending, yet it starts to seem obvious that unless I get some extra hands available or something else happens (such as needing less time for my work), I'll be busy for a long time with even these things. I've stopped obtaining new computer equipment that doesn't have a quite direct need in some context - both because I've started to run low on space to keep stuff and because I will have a hard time finding time to deal with the already existing stuff.

Obviously, if I keep making videos about the stuff, it will also take me a lot longer to deal with the equipment than if I simply did the fix on the computer itself. That is why I'm not documenting on video every computer operation I do at home, such as pimping up some old laptops (from 2007-2009) to be capable of some modern use by the other people living with me.

Turning to the tactical side, I have some potentially very good news after some bad news. They are related. The bad news is that I have more or less had enough of Kdenlive. A new version of Kdenlive (19.04.0) was released a couple of weeks ago. My previous version was 18.12.1b from a couple of months earlier, which is now labelled an unsupported legacy version, and on Ubuntu only AppImages are available for use. Shouldn't this be good news though, as there seem to be lots of good new features and fixes in it, with more to come?

Well, for a start, the new version refused to open practically any of my old projects. This is not essentially critical, mainly because I've adopted a workflow with Kdenlive where I do one episode with one version only; since I'm doing a series, I don't need to return to an old project. Kdenlive had been doing this in many earlier versions already, so I was sort of prepared with a workaround. I suspect the main reason is that the Speed effect has been completely reworked, so the old-version Speed effect (which I've used a lot, as there are some excessively long videos of disassemblies etc.) probably simply causes a crash, as it no longer exists in the new version at all. An annoyance of course, since I can't keep long-term things or pre-prepared intro/outro sections anymore, but I can live with this.

However, when I then started a completely new project, clean off, the application soon crashed. When I just hit Create Folder (for new clips), the new version of Kdenlive crashes abruptly. No questions asked, there is nothing to see, go home, people. Well, right... I could still live without making any folders, and it might work, for instance, by creating the folders separately without doing ANYTHING else with the project... but no. I've had enough of this crap.

For around two years I've tried to live with Kdenlive, and I've managed somehow. All the time the application crashes here and there, certain features irreparably break the whole project, and various features in the application slow down my workflow considerably and/or are just inconvenient in my opinion. Yet I have eventually managed to find workarounds, to know which places I should not touch, and to make backups frequently in such a fashion that I won't overwrite the old ones. Kdenlive has a lot of potential and several nice features. However, it is just terrible to work with in the long run. Many features simply won't work - not merely failing to work in the fashion desired, but some things are just flat-out non-functional or cause sudden crashes. Then, when a new version appears, something might get fixed and something new and usable might get added, but usually something else gets broken in trade. They've reported doing quite a while of project refactoring, and supposedly it is now pretty much done with version 19.04... and it still doesn't work.

Frankly, I have lost faith in the progress of the Kdenlive project. It will keep developing, but I don't see any reason why it would not keep developing in a rather jagged and irregular fashion. Updates will surely keep popping up every now and then, and supposedly it keeps getting better step by step (even with the steps back in between). However, probably, as with many open source projects, there will keep being several temporary people assigned to it, who will do some parts that other people will not know about. These random branches in the project might be left hanging until they are broken by some other parts later on. I don't really know if it goes exactly like that, since I've not participated, but I expect the situation to be something like that.

Making a proper dual screen setup in Kdenlive was a pain in the donkey.
With DaVinci Resolve it's no problem, and the layout doesn't get badly messed up at every startup.

So, in other words, I just don't believe Kdenlive will become sufficiently "done" for my taste in a long time, if ever. I'm also weary of the fact that large sections don't work, the application is bloated and unstable, and that I constantly need to give up project files if I wish to update my crippled editor. I even have a feature film unedited, and I simply dared not even try starting it with Kdenlive.

On the bright side, however, I noticed that DaVinci Resolve (DR from now on) had just gotten a new version 16 beta at almost the same time. I've ranted about this before on Facebook: when I first tried DR in version 14, it was not possible to get audio out at all with my hardware under Linux. Version 15 last year provided the long-awaited audio support, but since DR has very limited support for video and audio codecs (especially in the Linux free version), I felt it better to simply stick with Kdenlive for the time being. Especially as I was unable to get DR properly running out of the box.

I'm not sure if there is much real difference between versions 15 and 16, especially regarding runnability on Linux, but I figured to give it a try again. After getting CUDA installed and confirming that my NVIDIA drivers are as new as I can find (version 418.39), I was able to run DR without it getting stuck early on. I also figured that converting my videos to the DR-supported DNxHR codec with FFmpeg batch commands wouldn't be too bad, as I can just set them running in the background while doing something else. Obviously this conversion also brings some advantage in editing, since the DNxHR codec is more pleasant to process than, say, H.264 (the codec my main camera outputs), although in some cases, for instance clips taken as desktop screen capture, the video size might grow 100 times larger (from 200 MB to 20 GB in one case) if I want to keep the image quality.
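The batch conversion can be sketched as a small shell loop. This is my reconstruction, not the exact commands used for these videos: the DNxHR profile (dnxhr_hq), the 4:2:2 pixel format, the PCM audio choice and the .MP4 input extension are all assumptions about a typical camera-to-Resolve workflow.

```shell
#!/bin/sh
# Convert H.264 camera clips to DNxHR for DaVinci Resolve.
# dnxhr_hq keeps near-source quality; pcm_s16le sidesteps compressed-audio
# codec limits in the free Linux version; .mov is the container Resolve expects.
for f in *.MP4; do
    [ -e "$f" ] || continue                  # no matching files, nothing to do
    ffmpeg -i "$f" \
        -c:v dnxhd -profile:v dnxhr_hq -pix_fmt yuv422p \
        -c:a pcm_s16le \
        "${f%.MP4}.mov"
done
```

Each clip keeps its name with a .mov extension, and as noted above the intermediate files can grow huge, so the output drive needs plenty of room.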

And so I got DR running without problems. And gosh, it actually felt like having decent video editing software running on my computers for the first time in almost 10 years. I used to work with old Adobe Premiere Pro versions, but I had to move on because I'm no longer using Windows on my main computer and my old version didn't have proper support for Full HD videos (and to be honest, the CS2 from 2005 had started to feel lacklustre in its features and usability, which would not have bothered me much before 2010). Unfortunately, I then found out that it's not a trivial task to find a proper Linux-based video editor - even if one is planning to buy something.

I haven't used DR for much yet, but the first impression after the nightmares in Kdenlive felt almost like paradise. The interface seemed clear and easy to use (a video editing beginner might disagree though; there are lots of features around), and I could add, place, preview and adjust the videos just as I wanted without much effort. In addition, everything felt really smooth - stacking a few videos or adding some random effects was no problem, while Kdenlive started to stagger if any single effect was in use or clips sat on more than one adjacent track. I have noticed somewhat weird behaviour when detaching audio tracks from videos (tracks jumping to the beginning and crushing the old clips beneath), which seems like a bug, but I can live with that; plus the UNDO feature actually works, no problem (unlike in Kdenlive, quite often). Unlike Kdenlive, DR seems very stable too - although I needed quite a bit of tinkering to get the system into that state, really.

That said, I see huge potential for DR in my work, and it feels like DR might prove to become for video editing what Unity 3D was for game development when it became free to use. If no major unpleasant surprises appear soon, I think I shall have a lot more pleasant video editing moments to come. Maybe simple videos will require a bit less effort to finish after all, and maybe I will even dare to finally start that old feature film editing with DR... Stay tuna heads, I mean tuned!

Wednesday, April 17, 2019

Go tech, Amiga!...and other digressions

Normal external FDD, Gotek modded external drive and Edirol FA-66 doing acrobatics.
Recently it has again taken me a while to progress with the videos, as so much of my little free time has gone to the Amiga. After all, since I don't exactly have any fanbase, my retrocomputing hobby (videos included) is quite exclusively for my own amusement. Of course, should I have actual followers, the situation might change.

Then again, nowadays it's also clearly harder to get any visibility on the web without tricks or money. This blog, for instance, can't really even be found by googling unless the search has already been pre-customized by multiple visits to the site, like mine has. On many platforms I have noticed that free postings very easily sink with the stream, and only a few of the many possible visitors even get notified of the opportunity. If an example is wanted, Facebook is perhaps the most well-known name where such stuff happens - I don't see most of my contacts' interesting updates without specifically going to look at their pages, and likewise other people whom something I write might concern often won't see it. Of course, I have never been very fond of Facebook anyway, and in many ways that site is long past its heyday in my opinion.

Various popular social media celebrities have also admitted that they have more or less agreed upon liking circles with other popular posters - when you leave your virtual bookmark on the page of another popular poster, the algorithms take note and boost the visibility of both. Manipulating popularity lists has of course been happening pretty much always, whenever it has been possible. Back when MySpace was still used, it was a common convention to post something on the pages of other bands, as that would bring visibility to yourself too. Although then, I believe, it was more primitive, as it simply left a link with your name in more places, and there were no "news streams" like there are nowadays. Already back in the 1960s, it's told, Jimi Hendrix's first single Hey Joe was made to chart by hiring people to buy the record.

In any case, I'm digressing a bit now, as usual. Yet let that be, especially since this posting was not meant to be strictly about just one thing anyway. Yes, I know such behaviour is not a recommended fashion of writing, as it tends to confuse readers. In "diary" texts and letters, however, I prefer to give representations of my mental activity rather than write a coherent text.

So let me return to the Amiga for a moment. I remember how Niko Nirvi remarked in the 1990s, in the most significant Finnish computer gaming magazine, Pelit, something to the effect that the Amiga needs to be missed only by those who like (cute) platformers and action games, and he was never really into those. He was understood to be an Atari ST guy before moving to IBM PC compatibles, and on the ST, Dungeon Master was apparently his favourite game back in the day (for the record, Dungeon Master appeared on the ST in 1987, on the Amiga in 1988 and on the PC in 1992). I myself agreed in the last post that, rationally thinking, there actually aren't that many reasons to obtain an Amiga specifically if one wants to enjoy some old games, as most games actually still worth playing are also available on other platforms, and the Amiga version is often not even the best port.
The box opened: FDD on top and Gotek under it.

Gotek is more visible now as the intermediate
inner layer has been taken away.

Therefore, I'll openly admit that many games I've played recently, I've played largely through letting myself indulge in nostalgia - it's been a fun experience to play several games that I played as a kid, yet I would likely not care to play them without that childhood history. A 25+ year break has also affected my gaming skills - it took me more than a couple of tries to reach the last level of Batman the Movie, but it didn't take me long to get much further in evolution in Eco than I'd ever managed as a kid. This is to say I'm obviously beating my childhood self 10-1 in the cognitive sense, but the sensorimotor skills/memories are not quite as sharp anymore. I'm not sure if my actual reflexes and fine motor skills are any worse, it might even still be the opposite, but lack of practice has required me to rub some rust out of my bodily cogwheels.

Aside from playing and checking some old software on the Amiga, I have also been improving my related hardware setup. As already mentioned in the previous post, I'm using a Gotek floppy emulator for running my software on the Amiga (does someone actually still rely mainly upon genuine floppies if actively using an Amiga?). What I did not mention, though, is that I've made certain arrangements so that I can also use real floppies when needed. Visible in the previous post's photo, next to my Amiga, are both an external FDD and a brown cardboard box. The Gotek is in the box. But why?

Well, first of all, I didn't want to install the Gotek inside the Amiga case like so many do, because I also need to check what some old disks contain. I still have a lot of genuine Amiga floppies inherited from my big brother, and some of the demos on them are not necessarily available anywhere else, so I'm intending to save what is still left to be saved. I might have mentioned this somewhere before. Therefore I want both the Gotek and the FDD set up in such a fashion that I can use either when needed.

Ribbon tongue from the mouth of Amiga.
Micromys V4 adapter for PS/2 mouse on the background.
Previously I arranged this by simply having the floppy cable stick out from the Amiga diskette port, which allowed me to swap the cable between FDD and Gotek when needed. This caused certain inconvenience though, because I then needed to have both the Gotek and the FDD just lying around somewhere on the desk (or put one aside every time it was not used). Also, the FDD was prone to cause noisy resonation when placed directly on the desk, and the wrong types of items might have poked the disk-rolling mechanics that were now partially exposed at the bottom of the drive. That made me think of a case, so I prototyped a casing for both of them out of a cardboard box. I happened to find a headphones store packaging quite fit for the purpose, as it readily had an inner layer that could be easily removed - therefore I could place both drives on two levels of sorts, if I just cut holes for the cables. Now the drives are both more conveniently packed and dust-covered while still available for use. This also removed the FDD resonation/mechanical hindrance issue. I made some holes in the bottoms of the box and the inner level, through which I could stick needles or bolts that keep the drives sufficiently firmly in place inside the box.

Another thing I realized is that I could actually keep both drives connected simultaneously if I just swapped the cable. I modified one old PC floppy cable, which had connectors for two drives, to function with the Amiga; in a PC floppy cable there is a twist in the ribbon for the A drive, which needs to be straightened out so that it works with the Amiga - the B drive connector on the same cable is readily usable for an Amiga floppy disk drive. Now I had a disk cable to which I could connect two Amiga drives straight on.

Wait a minute, someone should exclaim now: you cannot have two disk drives running on an Amiga like that simultaneously! That is correct, I cannot - additional simultaneously used disk drives must be connected through the external FDD port, and the system gets messed up if two drives are chained and powered. However, I'm not intending to power up both at the same time. This means I can keep the ribbon cable connected to both drives and just swap the power when I want to switch from Gotek to FDD. A lot less effort! If two drives are connected to the ribbon cable, it doesn't matter that one of them is not powered up (although it does look a bit spooky when the Gotek OLED display lights up from the ribbon cable signals alone while the FDD is being used in this setup).

Unfortunately, the prototype setup doesn't yet work fully as intended. Because my desk setup, with the Amiga hiding under a small monitor table, is so cramped, the ribbon cable is barely long enough to keep both FDD and Gotek connected at the same time. Also, the box itself makes it a bit effortful to deal with the Gotek USB port when I need to load some new images onto the drive, for instance. Despite these issues, I find my setup a clear improvement over the earlier arrangement.

Regarding the use of the Gotek, I mentioned in the previous post how swapping Gotek floppy images in multi-disk games is sometimes somewhat effortful, even if it's not as bad as dealing with genuine disks. After the post I came to ponder it a bit more. I happened to have two old external floppy drives, one of which I got basically for free, as the previous owner stated it was broken. I figured that inside the external floppy drive case there is most likely just a normal FDD connected to an adapter board making it usable via the external floppy connector, and opening the case showed it to be exactly as I had anticipated.

This meant that, if the external drive was broken in the drive itself, I could simply swap the drive for another Gotek, which I happened to have purchased several months earlier. And how do you do, that just worked out fine in the end! It turned out it was indeed the drive that was broken (it spins the disks but does not find anything - probably a fixable fault, but possibly beyond my current skill and knowledge to repair), and a replacement Gotek fit just fine. Setting it up didn't go fully as expected though, as after reconnecting everything I had a hard time getting the second Gotek to read its newly prepared USB pen drive, and even getting the Amiga itself to run at all.

Finally, I found out first that certain cables had been connected incorrectly (the adapter inside was "upside down", and there was no normal indication of which way the cables should be connected). Secondly, I found out that for whatever reason the first Gotek drive ceases to function if the external Gotek is connected with its power off (!!!). So now the first Gotek functions, while the second Gotek is connected too, only if both are powered up. Go figure.

Therefore, with the strength of two Gotek drives, some Amiga games are a lot more convenient to play. I tested for instance Who Framed Roger Rabbit, which as a kid was a bit painful due to several disk swaps while loading the game for startup (despite it being on only two disks) - especially as after a game over the game needs to be loaded all the way from boot again... An experience a bit similar to Commodore 64 multiload cassette games... Not too fancy a game really, and actually very short albeit difficult, yet it has some very fancy cartoon graphics. Another example of increased convenience is Wings, where every mission start and end used to require a disk swap - now one just needs to wait for the load.

Nothing is like a dream with these systems, of course, and it is not necessarily super convenient to set up all the disk images on both USB drives. Even more grey hair could potentially come from the fact that I don't have another OLED display, so all the information the second Gotek gives is the number of the selected floppy image... WELL, I can live with an external list of the disk images set up - at least for now.

Yet another arrangement I've made is to have the Amiga audio sound better. I happen to have Behringer Truth B2030 monitor speakers that provide pretty nice audio output. The downside is that I cannot usually just plug them directly into any computer or other device, as that tends to cause a lot of interference - in addition to the fact that no standard computer's internal audio interface has 6.35 mm audio jack outputs.

So I need some sort of mixer or intermediate amplifier in between. Coincidentally, my old Edirol FA-66 FireWire audio interface happens to have two RCA inputs that I can very conveniently connect the Amiga to in direct monitoring mode. It plays just nicely now, as long as I don't keep the Edirol or the cables on top of the Commodore 1084 CRT monitor, where they pick up too much interference. I used to have the Amiga sound played through this monitor's internal speaker, but the quality is not too fancy and it's mono only.

If you wonder what the two audio jacks going into the Edirol's main front inputs are, though, that is my main desktop at the moment. Funnily enough, due to software issues (there is no official support for such a Windows XP era device on modern operating systems) I found that in most cases it is actually easier for me to use the Edirol as a plain mixer box for the speakers instead of a genuine computer audio interface. Besides the slight effort of needing to start a virtual audio server for the Edirol on each boot (yes, this could of course have been scripted to run automatically), I could not get the Edirol to play output directly from the computer without lapses or crackling when running some more resource-hungry games, so I made this workaround. Things won't often work as expected, but usually it's good enough if the outcome is as desired.

Saturday, April 6, 2019

Looming weavers of Amiga

30 years of computing cooperating on my desk.
Back in the 1990's, Lucasfilm (LucasArts) and Sierra adventure games were some of my favourite games of them all - especially the ones from the late 1980's and early 1990's. In fact I'd even dare say they formed the basis of my good English skills as a kid, and that was also one reason I liked them. There were a few of their games from the era, though, that I never managed to obtain and play myself.

Loom by Lucasfilm Games had until now probably been my most significant loss, and only now did I finally understand the name correctly. As a kid I thought Loom would be just a name - maybe some entity or character in the game. Then I assumed it was about the verb to loom - and this was supported by screenshots of the game, which seemed to have a kind of looming atmosphere. I suppose it might still be a pun with this intended looming in the background, but as a direct reference the name Loom refers to a loom in the game. And the protagonist, despite his looks with hooded cape and staff, is not a wizard or cultist but a weaver from the Guild of Weavers. Weavers with a loom, I see, makes sense.

So in any case, since I have had my dear Amiga 500 set up recently and happened to have the Amiga version of Loom available, I finally decided to give it a shot. In my youth I was somewhat concerned about the gameplay of Loom (I knew about it through reviews), since it was experimental and lacked the traditional verbs as commands. This was a few years before icons or mere mouse clicks became commonplace, and I certainly did not complain that Lucas still released a few more traditional verb-based games in the 1990's. In Loom the player instead simply points and clicks, and occasionally uses four-note tune patterns as spells to interact with objects.

As far as I know, Loom was never quite as successful as many other Lucas adventures, despite being referred to in many subsequent ones. Among the genre options of the era it surely is worth playing; however, it is more of a snack than a full meal like most of the better remembered titles. A bit like Full Throttle several years later, I suppose.

That is, I find Loom fairly easy and short - even by modern standards; I spent a few hours with it on my first try to get it completed, and only on two occasions did I have to stop for a moment, not knowing what to do. The first was quite near the beginning, when I had not yet realized that the patterns are also usable in the opposite order: if ECED is opening, DECE is closing. The second came when I did not notice there was a place to click in one screen.

This is to say that the game is pretty straightforward, and part of the easiness comes from the interface: there are not nearly as many objects in the landscape one can interact with as in most other Lucas games I'd played. The few items you simply pick up by double clicking, which also goes for talking and other things you can do - and usually there are only one to three things in each room to interact with, each for a single purpose, in addition to exits and entrances.

Usually the interaction is done by casting a musical pattern, or spell, which has quite obvious effects in easy-to-figure situations such as healing the injured or waking up a sleeper. In most cases the puzzles are just about finding which spell to cast where, and where to get the spell and the skill to cast it. At least there's no hassle with items! Generally, if there is a spell that seems logical to cast, it will keep the plot progressing on track. There are a couple of slightly more twisted puzzles with the stairs and the dragon, but since at any given time there are only so many rooms you can access, objects to interact with and patterns you can weave, even a plain "try everything everywhere" iteration won't take long. You also cannot die or fail by any action - except by not taking a record of a pattern heard earlier, as there is often no returning to old zones.

On the other hand the narration is quite alright and, game-wise, not all that standard and stereotypical, where there simply is a great evil fellow that needs to be defeated to save the world. I guess there are similar things here too, but that is far from being the actual plot or anything like that. On yet another hand, the fantasy setting and storytelling offer some quite possibly unexpected twists and even fun transitions, but in many cases I think they go a bit too fast and lean a bit too much on deus ex machina. I mean there is fairly much moon logic, at least in the events - you just do what you can, and that gets you dragged on to something completely else without really looking back. Beyond the gameplay differences, the storyworld would have felt deeper if I had needed to walk away from a region by my own decision, because I thought I had seen everything - now it is largely the case that if it is possible to move on, there is nothing more that could have been done before.

The game is also not necessarily very consistent in the style of its content. I mean that despite the looming dark atmosphere in the beginning, the spirit (music included) is rather kid-friendly and maybe even partially too juvenile and fairy-tale like. Yet later in the game there are almost horror themes and even violence that might not be fully appropriate for the youngest of the family, and that was a bit of a surprise (and amusement) to me. Although actually many old games are "inconsistent" in style, and in adventure games especially you can encounter just about anything, anywhere, at any time.

As I was earlier speaking of the time spent on the walkthrough, I reckon I'd have spent at least a third less time had I played the game on PC. Testing wrong patterns on the Amiga takes a while, since the tunes are played at a very peaceful pace on a standard Amiga 500 (a RAM expansion doesn't help here). Overall much of the animation, especially when there are a few more things on the screen, is quite time consuming from a modern, impatient point of view. In fact I do remember from back in the day how frustrating it was that many games had several scenes where the player just needed to wait for the character to walk across the screen with nothing to do. Loom even has the "era mandatory" maze, although at least that is fortunately almost as straightforward as the rest of the game.

Speaking of slowness, there are of course some disk swaps too, but they are not too bad on a Gotek floppy emulator, where swapping a disk is just a click. It's a bit annoying when a scene change needs two or even more disk swaps, though. One also has to wait for some loading - I can live with that, and the emulated loading sounds have some spirit in them, yet I wouldn't mind having a disk turbo on the Amiga, should one not hamper compatibility.

Nevertheless, what is more of a problem is that one needs to be quite careful when swapping disk images on the floppy emulator. There is unfortunately no way to write protect the floppy images, and swapping them too abruptly can corrupt the disk image. This is especially due to the way the Amiga loads and pauses a bit every time a disk is inserted, and since disk swapping on the Gotek is so easy, it can even happen by accident. So in the end I managed to corrupt my game disks and, worst of all, my save disk twice - first at the very beginning, while testing whether I could save in the first place, and the second time after I had already completed the game. The second time the save disk did not get spoiled completely, though, so I didn't lose the saves. The risk of lost records is real with this system. Always have a backup of your disk images somewhere other than your USB stick when dealing with a Gotek, and consider backing up the save images occasionally as well!

The fantastic loom of the game itself.
Those things lead to one sad conclusion: rationally there are not that many reasons to play quite a few Amiga games, if you can just pick any version available. Especially as for most people nowadays the Amiga (even emulated) is not the most readily available platform. There are lots of games that exist only on the Amiga, or where the Amiga version is the best one. Unfortunately the latter applies mostly to games from the late 1980's, when the Amiga was the super home computer in Europe, while the former applies mostly to games outside my favourite genres - various action-based stuff.

Playing on the Amiga is not a rational but an emotional choice, however. I do have lots of nostalgic feelings for this device, as it's the first computer I ever really had in my life. Still, there is also something weirdly fascinating about the Amiga and its software. It just doesn't feel like any other computer, apart of course from its rival the Atari ST, and therefore even titles familiar from other platforms can feel like curious experiences, at least in small batches. Not to mention the demos... That stuff would already call for an article of its own, so I'd better get back to having some more good Amiga time.