My Plight: An Overview from a Novice Programmer

I’ve got Appmageddon breathing down my neck, and every time I turn around another page that worked just fine for years turns up with a failure of some sort. A new customer isn’t found in the query. The contact search page renders an ‘80020009’ error. I spend three, maybe four days fixing it, only to discover it would’ve been easier and faster to just rebuild the darned thing. I can’t put enough hours together to redo the entire system, though.

Adding to the frustration, there are about six hundred thousand individual pages to convert. I look at the sheer volume of work and get night terrors. The biggest and best thing I can do for this puppy is put it out of its misery and build an N-tiered application so we can simply add layers to it as we need to.

For those who aren’t savvy computer programmers (me, for one), N-tiered applications segregate the data from the presentation (the “show me the data all pretty” part) with distinct layers, or tiers, of processing. You have your database(s) living happily on a server somewhere, and you have your web page living happily on a server in a different somewhere (even if it’s the same physical machine), and you can get your data into your web page by making calls to a “go get my data” layer on the server. Then, when you do something crazy (say, migrating your data from a desktop database to an enterprise database), you only need to make your changes in the “go get my data” layer. All the other layers will automatically receive the right data because “go get my data” knows how to get it and what to get.
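To make that concrete, here’s a minimal sketch (in JavaScript, with made-up function and field names) of what splitting the “go get my data” part from the “show me the data all pretty” part looks like:

```javascript
// "Go get my data" layer: the only code that knows where the data lives.
// (All names here are invented for illustration.)
function getCustomer(customerId, db) {
  // If the backend moves from Access to SQL Server, only this function changes.
  return db.find((row) => row.id === customerId) || null;
}

// "Show me the data all pretty" layer: knows nothing about storage.
function renderCustomer(customer) {
  if (!customer) return "<p>Customer not found.</p>";
  return "<p>" + customer.name + " (#" + customer.id + ")</p>";
}

// A page wires the two together:
const fakeDb = [{ id: 42, name: "Acme Corp" }];
console.log(renderCustomer(getCustomer(42, fakeDb))); // prints <p>Acme Corp (#42)</p>
```

Because `renderCustomer` never touches the database, swapping the backend later means changing only `getCustomer`.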

And let’s say you constantly need to calculate a date…you know, like the end of the fiscal year, and the start of it, for certain pages. Or maybe you need to calculate the beginning and end of the month. Or you need to cycle through a period of six months prior to the current month. Well, all your date calculations live in a “figure out this date for me” layer, separate from your “go get my data” layer and your “show me the data all pretty” layer. This is technically the “business logic” layer, but many would argue that’s an oxymoron. There are other things the “figure this out for me” layer could do besides dates, too. For instance, you could have it figure out things like how many parts have shipped in a month in dollars. You’d go to the “go get my data” layer and then send the returned data to the “figure this out for me” layer to sum the cost or sales amount for those shipments. The total is returned to you and you can put it in your “show me the data all pretty” part as you need to.
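As a sketch (in JavaScript, with a July-to-June fiscal year assumed purely for illustration, and made-up helper names), the “figure this out for me” layer might hold helpers like these:

```javascript
// Start and end of the month containing a given date.
function monthBounds(d) {
  const start = new Date(d.getFullYear(), d.getMonth(), 1);
  const end = new Date(d.getFullYear(), d.getMonth() + 1, 0); // day 0 rolls back to the last day of the month
  return { start, end };
}

// Fiscal year bounds, assuming (for illustration only) a fiscal year starting July 1.
function fiscalYearBounds(d) {
  const startYear = d.getMonth() >= 6 ? d.getFullYear() : d.getFullYear() - 1;
  return {
    start: new Date(startYear, 6, 1),    // July 1
    end: new Date(startYear + 1, 5, 30), // June 30
  };
}

// Sum shipment dollars for records the "go get my data" layer returned.
function totalShipped(shipments) {
  return shipments.reduce((sum, s) => sum + s.amount, 0);
}
```

The “show me the data all pretty” part just calls `totalShipped` on whatever the data layer hands back; it never needs to know how the total was computed.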

N-tiered applications bring a lot of advantages to the table. They’re easier to maintain. Have a change to your data? Change the “go get my data” layer and the change fixes all the “show me the data all pretty” parts which use it. Need to change how stuff is calculated? Make the change in the “figure this out for me” layer and it propagates throughout the “show me the data all pretty” parts which rely on it. Very simple, very clean, very nice. But we don’t have an N-tiered application in our system.

One drawback to an N-tiered system is, you need to plan for it. You can’t just hurry up and slap it together, because you have to figure out how these things work over the long term. One advantage of being a formally trained developer is, you learn how to plan for N-tiered architecture. And you also learn the languages and tools you need to build it. I am not a formally trained developer. And my predecessor wasn’t either. Both of us are self-taught, and the consequences of that show because we have a 2-tiered system right now. The “go get my data” parts and the “show me the data all pretty” parts and the “figure this out for me” parts are all living happily (or not so happily) together on the pages themselves. So the two layers, in this kind of system, are the data layer (the database(s) themselves) and the “go get the data/figure this out for me/show me the data” parts together in one happy, crappy slop of code-dung.

A further disadvantage of a 2-tiered system is maintenance. When the database(s) change or are renamed or are somehow altered, it’s not easy to update all the stuff relying on that data. Each and every page pointed at the data will break, give you an ugly and unhelpful message, and stop showing the data all pretty. The developer maintaining such a system has to crawl through it and find every instance of that pointer and update it to the new location, on every single page, in every single application pointing to the old location. It’s a maintenance nightmare, to say the least. And it’s something my predecessor either never thought about, didn’t care about, didn’t know about, or wasn’t allowed to address.

I’m not sure why he chose to use a desktop database for his backend. Microsoft doesn’t recommend it, doesn’t support it, and offers a compact and personal version of their SQL Server absolutely free, which they do support and recommend for use in distributed applications. (They want you to use SQL Server and spend the thousands of dollars on licensing and support, but will tell you it’s okay to use the personal/compact version if you must.) For those of you who aren’t savvy computer programmers (me, for one), a distributed application is one which many users can access simultaneously through a network or intranet structure on an enterprise-wide basis. For me, the system I’m supporting is an enterprise system accessed by people nationwide through the intranet.

But it’s a 2-tiered application built on a desktop database backend and it’s my responsibility to make sure it works, and that I scale it up as I need to for use by more and more people in more and more locations.

It’s my dream to one day figure out how to build an N-tiered application with a “go get my data” layer, a “figure this out for me” layer, and a “show me the data all pretty” layer in nice, distinct assemblies which I can call from the individual applications or pages which need to use them. To be honest, I don’t know how to do that. I can make the data come down easy enough, but I can’t figure out how to reference the assemblies doing that in the individual apps or pages needing the data. Once I figure that out, I’m well on my way to developing a system which I can proudly pass to my successor knowing it’s ready, it’s maintainable, and it’s easily scalable.

Until then, prayers for my sanity and job security are appreciated.

Have a great weekend, and I’ll see you on the other side.


2 thoughts on “My Plight: An Overview from a Novice Programmer”

  1. Remind me – do users of your system access it through the browser or do you have to maintain your own front-end? I know I’ve asked before, but I forget. I think it’s through the browser. The rest of my comments will make that assumption.

    That’s correct, the applications I’m complaining about are all ASP (classic) pages, most of them written in VBScript, some with a little JavaScript thrown in. There are also entire pages of JavaScript apps, but not many. Why he decided to get away from that, I don’t know.

    Out of curiosity, how does data get into the Access database? There are only about a zillion freely available dbs out there: MySQL and PostgreSQL, if you like SQL. Tools abound to import/export data. Also, I’m itching for a chance to use MongoDB on something.

    Data gets into the Access databases in a couple of ways (which also needs to be looked at), depending on the database. For some, a nightly text file is FTP’d to us, and the databases or an Excel spreadsheet run a macro to clean the text file up and then import it to the databases. Then a query might execute which will put the data out as a downloadable spreadsheet somewhere. In other cases, the text file is sucked directly into Excel so a macro can clean it up and just save it as a spreadsheet, separate from the import into Access. In some cases, the spreadsheets call other spreadsheets which call databases (and sometimes the databases call spreadsheets) to have this happen.

    Ideal situation: come up with a Windows service-style application with no UI which runs in the background, like a listener. When the files arrive, it does the processing in the background and stuffs the data into the appropriate databases and spreadsheets behind the scenes. This way, the applications don’t need to be loaded on the server (they are currently), and the stuff doesn’t run if no data files are transferred.

    As it is now, the macros execute nightly as scheduled tasks. If something goes wrong with one of them, or the files don’t arrive in time or at all, the macros fail. Sometimes they fail gracefully. Other times, not so much, and there’s a cascading effect on the OTHER macros which depend on Excel/Access. There have been days when I’ve had to spend the first four hours of my day repairing and reloading the databases.

    Man, I know it’s kind of late in the game, but I would so make a case to my boss for switching technologies to something that makes it easy to n-tier / MVC / whatever. Yes, it would mean a new language, but Ruby, Python, and PHP all have excellent app frameworks (Rails, Django, and CakePHP, to name a few) that make separation of code easy. All have tons of resources out there that assist in learning. Plus, I’m pretty sure all of them can connect to an MS Access database; it’s just a matter of installing the right driver. All have easy-to-set-up servers (PHP more so than the others, IMO).

    He’s already tasked me with learning SQL Server. I don’t have to make a case, he knows. I’d LOVE to move this stuff to SQL Compact or Express or whatever it’s called now, but we have to have a way to populate them with our nightly data, and I just don’t know how to do that. HE’S not the hold-up, I am. I guess I could continue to let the Access databases pull the data in and then migrate it later, but that’s redundant and defeats the purpose.

    There’s also stuff out there like ASP.NET MVC that you could check out if you want to stay in microsoft-land.

    We’re using ASP.NET/VB.NET (not C#; learning curve’s too steep right now). I haven’t tried MVC, but the web forms model works fine. It’s MY limitations, not the technology, which is the slow-down.

    Or, if you’re set on doing it in C# or VB.NET, check out OpenRasta.

    We’ve got the .NET Framework up to version 4.0 (we can’t load 4.5 without upgrading the server, which will not happen right now with the budget constraints and spending freezes we have), so I’m good. I don’t need a new framework. I need better knowledge/understanding of HOW to program using what I’ve got (which includes a full paid version of Visual Studio 2010). So, the tools aren’t the problem. I am.

    The good part about using a framework like any of the ones mentioned is that a whole lot of the planning has already been done for you.

    See above.

    Also, consider that you don’t have to swap out all the functionality of the old system at once. If you build a new system, you can add one user function at a time. Then you can set up redirects or links or whatever within the old system pages that go to any completed sections of the new app.
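    As a sketch (with made-up page names), the redirect bookkeeping could be as simple as a lookup table the old pages consult:

```javascript
// Which old pages have been rebuilt in the new app (page names invented).
const migratedPages = {
  "/contacts/search.asp": "/newapp/contacts/search",
  "/customers/list.asp": "/newapp/customers",
};

// Old pages check this first; in classic ASP the actual jump would be
// Response.Redirect(target). Unmigrated pages get null and carry on as before.
function redirectTarget(oldPath) {
  return migratedPages[oldPath.toLowerCase()] || null;
}
```

    That way the cut-over happens one function at a time, and removing an entry rolls a section back to the old system.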

    I have to learn more about redirects and such. That’s the plan, really; for now, I’ve been replacing the old functionality with new stuff whenever some enhancement is required or a failure occurs. But again, I don’t have the skills yet to create an N-tiered application. Believe me, the MOMENT I figure out how to call an outside assembly from inside another, I’m SO gonna have a DAL set up. ALL the pages will simply call the DAL and then I can use LINQ or whatever to do the filtering for the records I want. But for now, it’s still 2-tiered as I stated.
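    Just to sketch the shape I’m after (JavaScript standing in for separate VB.NET assemblies, with canned data instead of a real database):

```javascript
// The DAL: in .NET terms this would be a separate class library that each
// web project references. The canned data stands in for real queries.
const dal = {
  getOrders() {
    return [
      { id: 1, customer: "Acme", total: 250 },
      { id: 2, customer: "Beta", total: 75 },
      { id: 3, customer: "Acme", total: 120 },
    ];
  },
};

// A page asks the DAL for data, then filters for the records it wants
// (the LINQ-ish step, done here with Array.filter).
function ordersForCustomer(customer) {
  return dal.getOrders().filter((o) => o.customer === customer);
}
```

    Every page calling the same `dal` object instead of embedding its own queries is the whole point: change the data source once, and every page follows.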

    Anyway, shoot me an email or call me sometime pretty much any afternoon if you want to bat around more ideas for how to continue tackling this.

    Thanks, B, I probably will! I have to stay in Microsoft-land, but I’d love to discuss how to use a DAL and BLL as separate assemblies. If you have insight on those things, that’d be awesome. 🙂

  2. Oh man, that’s a beast of a problem!

    Oh, I’m sure for an accomplished programmer this is a snap. After all, my predecessor wasn’t exactly a Master Systems Architect and he managed to build it from the ground up using inferior technology and methods.

    A long time ago I worked for a large telecom provider in Canada whose database had been sloppily put together initially, and added on to at random for decades. No one single person in the organization knew how the whole thing worked. It was called “SuperSystem”. When SuperSystem went down, sometimes it was down for days. I recall going in to work for three days running, doing nothing but reading library books. I hope they’ve sorted that mess out by now!

    You ALWAYS manage to unhinge my jaw with your accomplishments, Spark! I’d love to be a shadow of the technical person you are! Outstanding! My admiration grows daily. 🙂 Thank you for being such a wonderful source of encouragement and guidance. I value your insights so much.
