In 2018, A PC Game Is Being Made For DOS

MS-DOS, the text-based operating system people used before Windows took over the planet, was how PC games were played in ancient times. It’s been dead for decades, but one developer figures 2018 is as good a time as any to make a brand new game for it.

Having raised $US113,640 ($154,283) on Kickstarter, David Murray is hard at work on Planet X3, a strategy game that will run on an operating system first released in 1981.

Unsurprisingly, he’s running into some problems. In an interview with Gamasutra, he speaks of the challenges of making a brand new game for such an old OS, which range from video memory headaches to troubleshooting dead ends.

“You Google [a programming problem], it will pull up some forum that will say ‘here’s how you do that, check this link’”, he says. “You click the link and it’s dead. It’s probably been dead for 10 years.”

Planet X3 is due in May 2019, and is actually a sequel to Planet X2, which was released last year... on the Commodore 64.


Comments

    Art looks great!

    This is art. So cool what he's doing.

    Looks like it's going to take me back to my Dune II days... Let's hope it comes with a PDF manual :)

      Agreed. First thing I thought of was Dune II. The art looks fantastic.

    This guy is insanely talented. Not only does he code across multiple old platforms, but he also writes and performs chiptune-style music, restores old computers, and restores old musical keyboards. Worth checking out his YouTube channels.

    Geez, hipster much? Why doesn't he just use a modern operating system like everyone else does?

    The 8-Bit Guy is the fucking bomb! Glad to see some X3 love here!

    I see what the 8-Bit Guy pulls off and it does make me wonder.

    Even though we have a lot of modern tools, etc., I sometimes think new and up-and-coming programmers should actually spend time coding on these old platforms so they get first-hand experience with concepts like memory management.

    At the very least it should give them a better appreciation of what they have now, and an idea of what is going on as they use libraries and other code.

    Speaking from experience, I never did like C++ pointers (mainly because they just outright owned me), but I still understood their use, and I even think of memory when I write my loops and try to make them "self-cleaning": if something is no longer needed it's deallocated there and then, not at the exit of the function.

      @zombiejesus, your thoughts?

      Sorry I forgot to make mention of you in my prior post.

        It can be useful to learn old programming concepts, it certainly helps show why certain things are the way they are (eg. zero-based arrays). C style pointers and references are a bit shit though - they can be very powerful in very limited circumstances, but cause confusion in very many circumstances.

        I'd say it's probably more important to learn what pass-by-value and pass-by-reference mean than how old pointers used to work (and for that matter, what the 'value' being passed actually is, because sometimes the value IS a reference). I feel similarly about memory management - it's fine to understand how it used to be, but since the advent of automatic memory management and garbage collection that knowledge is mostly superfluous unless you're working on embedded systems. In automatic memory management models, you do need to understand when you still have an active pointer to something because GC won't clean that out, but you don't need to know the rest really. And for the best, malloc and memory micromanagement in C was pretty awful.

        Manual deallocation in a managed memory model may be harmful depending on how the model is implemented. I know in Java you should never worry about manual deallocation, it'll actually interfere with the GC calculations to do it yourself. For most modern languages I'd say correctly using block-scoped variables for loops is perfectly sufficient, but there are some languages that are fucky about that (Javascript, you piece of shit, I'm looking at you).

        What language did you have in mind when you were talking about deallocating during loops?

          What language did you have in mind when you were talking about deallocating during loops?

          Nothing fancy, just good ol' C#.

          Granted it has garbage collection, but it doesn't hurt to at least think of the heap and plan with its behaviour in mind so the garbage collector has an easier time with its own job.

          C style pointers and references are a bit shit though - they can be very powerful in very limited circumstances, but cause confusion in very many circumstances.

          For real? I thought it was just me lacking the ability to know when indirection was needed!

          And for the best, malloc and memory micromanagement in C was pretty awful.

          I had forgotten about those calls until now.

          Excuse me while I curl up in a corner and cry myself to sleep.

          [Screams off frame] Ahahahaha, I want my mummy!

            I haven't used C# in a while but I'm pretty sure it's the same as Java in that anything scoped inside the loop is released and recreated each iteration. You do have to be careful of large batch processing in any language though, I've had to rewrite loops quite a bit over the years because they try to operate on enormous datasets with the whole thing loaded in memory at once. Java 8's stream API makes that infinitely simpler, and I'd wager C# has something similar for working with large (or infinite) collections.

          I'm going to say that pretty much every programmer should understand malloc, free and pointers. If someone doesn't understand this stuff, then they don't actually know what is going on under the hood when they're programming, which means they're liable to write inefficient code.

          Automated garbage collection and compiler optimisations shouldn't let programmers off the hook for actually understanding how their code interacts with the underlying system. That leads to sloppy programming and poor performance.


            Huzzah, your comment was finally approved and I can read it.

            I actually don't agree with this. Understanding how memory works in a given environment is important, but the programming paradigm has moved on from manual memory management. Understanding malloc/free is effectively useless information in Java, C#, Python, Golang, Javascript, etc. because they're not how the environment manages memory any more.

            In almost two decades of experience across a dozen languages, I've fixed far more problems caused by well-meaning devs trying to optimise memory allocation but making things worse because they're fighting against the garbage collector. In languages other than C/C++, I can count on one hand the number of times understanding manual memory allocation has been any benefit whatsoever, and most of those occasions were dealing with embedded systems.

            These shifts happen periodically. There's no meaningful value in learning assembly any more unless you're working directly with it, because modern programming has nothing to do with it. Often, people who try to second-guess the compiler end up making things worse.

            Letting the automatic memory management system do its job doesn't lead to sloppy programming or poor performance at all. Understanding memory is important. Understanding how your environment manages memory is important. Understanding manual memory management just isn't, any more.

              I have no idea why I'm in some posting comment hell. Sometimes it's fine, otherwise it's "nope, awaiting moderation for you."

              I write a lot of C/C++, so I'll definitely confess to bias, but I still think it's very valuable to understand some of the nuts and bolts of what's going on under the hood. That said I absolutely agree that manually tweaking garbage collection is probably a bad idea. Unless you really know what you're doing, you may do a lot of work for not much use.

              The classic example I love is a Stack Overflow question where someone was iterating through an array of integers and performing different actions based on whether each number was above a certain value. They noticed that if they sorted the array, they suddenly got a massive performance increase. The sort had been added almost by accident, and they had no clue that the huge gain was entirely down to CPU branch prediction saving them a ton of time.

              It's this kind of deeper understanding of how the computer and its system work that allows programmers to create efficient and elegant solutions. Sure, if someone doesn't understand these things, they can probably still get a decent solution. But if they do understand them, they've got a significantly better chance of being able to create a better solution in the same timeframe.

                If you're working in C and C++ (I still do for UE4 development), understanding malloc/free is definitely useful since that's the native memory system for those languages. Thankfully stdlib and some custom libraries provide ref counting smart pointers (eg. aforementioned UE4) so as long as you work with those you're mostly free from worrying about the day-to-day parts of it, but it's useful to understand how smart pointers work under the surface.

                It's interesting to note that almost every modern language has some form of automatic memory management and garbage collector. That said, if you're writing certain embedded systems or an operating system, you can't go past C/C++. Horses for courses and all that.

    Kotaku, you could have put a tad more effort into this article. Very short for something so noteworthy.
