Filing the Fillings


Michael H Rich describes how micros can improve dental health

The use of any kind of computer in a dental practice divides neatly into two compartments: use in the office, which is comparable to using a micro in any small business, and use for clinical records. The latter involves a far wider concept than ‘ordinary’ business use, as the software is highly specialised and, as will be described below, needs combined graphics and text on the screen to be fully effective.

Before the advent of the microcomputer there was very little hardware or software available to help the dentist introduce computerisation into a dental practice.

What did exist was in the nature of a large ‘mini’, complete with the necessity for an air-conditioned ‘cubicle’ for the CPU, which used fixed/removable hard disk cartridges. This, of course, allowed a multi-user facility but in the context of a small dental practice was far too expensive to be cost-effective.

Minicomputers are still available for dental practices; they are smaller in size as well as being slightly cheaper in price, and the suites of software with these systems do a reasonable job of helping the dentist to run his practice. The argument about being cost-effective still applies and thus they are for the larger practice only.

The micros of the Apple/PET/Tandy variety (and this list is by no means exhaustive) have, of course, opened up the world of computerisation for the small business, and it should be realised that a dental practice is precisely that. Many of the available software packages for running such a business can be applied to a dental practice. The management of accounts can be dealt with in a standard manner, as can stock control; although a practice employing half a dozen people hardly needs payroll software!

What distinguishes the dental practice from a small business is the clinical aspect of treating patients and the paperwork that this generates. When examining patients a dentist records the clinical information derived from the teeth in a form consisting of various shapes to designate types of cavity, fillings present, teeth to be extracted, dentures present and a variety of other conditions. This pictorial representation of a mouth is easy to scan and assess and is an internationally standard method. Recording this information in written form, although suitable for a standard database package using routine file handling procedures, would be very long-winded and would mean abandoning the standard pictorial method.
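As a rough illustration of the point, a pictorial chart reduces to surprisingly compact data. The sketch below is not taken from any dental package described in the article; the condition codes and helper names are hypothetical, and it simply assumes the international FDI two-digit tooth numbering.

```python
# A minimal sketch (not from any package in the article) of a tooth chart
# held as data: one condition code per tooth, using FDI two-digit numbering.
# All condition codes and names here are hypothetical.

CONDITIONS = {
    "S": "sound",
    "C": "cavity",
    "F": "filling present",
    "X": "to be extracted",
    "D": "denture",
}

def new_chart():
    """Return an empty chart: FDI numbers 11-18, 21-28, 31-38, 41-48."""
    teeth = [10 * quadrant + n for quadrant in (1, 2, 3, 4) for n in range(1, 9)]
    return {tooth: "S" for tooth in teeth}

chart = new_chart()
chart[36] = "F"   # lower-left first molar has a filling
chart[11] = "C"   # upper-right central incisor has a cavity

# Count teeth needing attention on this chart
print(sum(1 for c in chart.values() if c != "S"))  # prints 2
```

One byte per tooth is all the raw charting needs, which is consistent with the modest per-patient storage figures quoted later in the article.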

There is software available for use on micros which does do this graphic charting of the clinical conditions in a mouth and this is allied with space to write clinical notes of treatment to be done, or which has been done. This is often conjoined with a suite of programs which will price the work done, whether under the NHS or privately, and will produce bills for patients and carry out the usual reconciliation with payments, aged debt analysis and so on. The software will often include a facility for routine recall of patients at a standard time interval and this raises the other major aspect of the application of computerisation of a dental practice – the appointment book.

It is necessary to realise that anything other than the appointment book in a dental practice is capable of being replaced or renewed in the event of a complete disaster, eg, a fire. To take an extreme example, if the premises are totally destroyed one can set up a tent with a telephone line outside the front door and with a list of patients due one can reconstruct records and re-schedule appointments until the premises are fully functional again. Without this book a dentist might as well go home. Consequently a dentist has to consider very carefully whether to commit this vital aspect of his/her practice to an electronic form which may be subject to the vagaries of an irregular power supply, corruption of storage media and the sundry other faults which can occur. To back up one’s records every time a fresh appointment is made or one deleted from the ‘book’ would be counterproductive in terms of time even though it is essential if the possibility of either missing a vacant time slot or double-booking is to be avoided. An actual appointment book can be kept in a fire-proof safe for peace of mind.

In addition to this, the software available at present for this function will only display, at best, one day per VDU screen (some only half a day) per dentist. A good receptionist can keep a visual image in mind of the blank spaces in an actual book and can turn a page to ‘bring up’ a whole week at a time much quicker than any software can on a screen.

To go back to the function of computerisation of clinical records, one has to realise that for this to be fully effective there has to be a terminal and screen in each surgery with central mass storage as well as a terminal, etc, at the front desk. This again raises the question of cost: even using micros for only two surgeries and reception on this basis with, say, 10Mb storage will put the cost towards the five-figure mark, which becomes very expensive in the context of a small dental practice. The actual storage figures for dental records with chartings for each patient may be in the range of 500-700 bytes per patient per course of treatment and this multiplied by approximately 3000 patients per dentist gives some idea of the basic storage needed to keep clinical records. Details of treatment have to be kept for at least two years after completing a course of treatment and this, allied with all the other office functions needed, suggests that the 10Mb mentioned above could be a conservative estimate for a practice containing three or more dentists.
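The storage estimate above can be checked with simple arithmetic. This sketch just multiplies out the article's own figures (taking the upper 700-byte estimate); the variable names are mine:

```python
# Back-of-envelope check of the storage figures quoted above:
# 500-700 bytes per patient per course of treatment, ~3000 patients
# per dentist, records retained for at least two years.

bytes_per_course = 700          # take the upper estimate
patients_per_dentist = 3000
dentists = 3                    # the three-dentist practice mentioned
courses_retained = 2            # say two courses kept per patient

total = bytes_per_course * patients_per_dentist * dentists * courses_retained
print(f"{total / 1_000_000:.1f} Mb")  # prints 12.6 Mb
```

Even before office functions are added, the clinical records alone overrun the 10Mb figure, which bears out the article's suggestion that 10Mb is a conservative estimate.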

The other main problem concerning dentists at the present time is the possible computerisation of the NHS claim form FP17. This is a complex form which has to be filled in accurately so that the dentist can be paid by the NHS. It contains details of the patient: name, address, clinical charting grid, a minimum of seven different dates to be filled in and various other details. Software has been written to cope with this so that the form can be printed out after the data has been put in from the handwritten clinical notes. The problem with this is that the slightest change in the format of the grids, etc, on the FP17 would mean rewriting this software. A suggestion has been made that the central collating body for these forms could use ‘light pens’ to read any printed codes produced by any printer, enabling a dentist to use whatever internal record system is desired. This problem still has to be resolved and will depend on whatever change in method of remuneration of dentists may be applied in the future.

The only other main office function for which a computer is often used and not yet mentioned in connection with a dental practice is the use of word processing. This is not generally a great necessity in a dental practice. Recalling patients every six months is often a feature of a dental software package and would incorporate a print-out (hard copy) format.

In summation, one can state that the small system with a couple of disk drives, screen and printer (not necessarily of letter quality) with a good database software package at about £3000 is a viable proposition for even the single-handed practitioner. The limitation of use to office procedures only is still worthwhile, even solely on the basis of eliminating lots of pieces of paper. Clinical records require considerable mass storage, sophisticated software and even provision in the actual surgeries to accommodate the extra terminals needed.

First published in Personal Computer World, April 1983



Mike Liardet looks at Multiplan – Microsoft’s entry to the spreadsheet fray.

After releasing the Apple version of Visicalc about three years ago, Visicorp enjoyed at least 18 months completely unchallenged in the market for what has now become known as spreadsheet software. But in the last year and a half there has been a steady stream of Visicalc rivals arriving on the scene and, naturally, some of the established companies have been getting involved in this growth area.

Probably the best known of all the micro software companies, Microsoft has a pedigree going right back to those prehistoric days of ‘core-store’, paper-tape and teletypes – 1975 in fact, when the first of a million microcomputer systems was equipped with a Microsoft Basic interpreter. Now Microsoft has launched its own spreadsheet system: Multiplan. Will Multiplan further enhance Microsoft’s reputation for excellence? Will it be another Ford Edsel? (You should get this point if you have heard of a Ford Edsel and you definitely will if you haven’t!)

The first thing that strikes you when confronted with a copy of Multiplan is the packaging: Microsoft has obviously invested a lot of effort (and money as well, I am sure) in presenting its ‘new baby’ to maximum advantage. A heavy-duty transparent plastic case holds a substantial ring-bound manual, system disks, various leaflets and a few pieces of carefully positioned cardboard mouldings – simply there to mask out awkward gaps and present an uncluttered appearance through the transparent box. Readers who are concerned by such a flagrant wastage of the world’s resources on a mere piece of marketing-hype will doubtless be relieved to learn that you need not throw the box away after purchase – it readily converts into a sweet little bookstand to support your manual!

Anyway, underneath the packaging we eventually find the disks – my review copy was for the Apple II (DOS 3.3), but Multiplan is also available for the Apple III, CP/M systems and, of course, Microsoft’s MS-DOS. All versions are evidently functionally identical, with just a few pages at the start of the manual outlining any minor differences, so non-Apple owners should still bear with me! (I also had the opportunity to take a quick look at the MS-DOS version on a Sirius, so have made occasional references to this, too. In particular, I have included benchmark results for the Sirius version, specifically to check out Multiplan’s performance with a new-generation (8088) processor and all that extra memory capacity.)

Getting started

Getting started proved fairly easy – the ‘First Time’ instructions were not on page 1, where I like to see them, but a little bit of page-thumbing soon tracked them down. A bit of disk copying, data disk initialisation, and two or three minutes later I was faced with a reassuringly familiar display of a spreadsheet. The only hold-up in all this was to have a good chuckle at the latest piece of computer jargon, encountered in the instructions for setting up the system for the optional (on the Apple) 80-column display mode: ‘Recable’ – to exchange the 40-column video cable connection for the 80-column one!

The initial display is of the top left hand corner of the spreadsheet, showing seven spreadsheet columns and 20 rows, all completely blank. The remainder of the display is devoted to helpful prompts: the names of twenty different ‘commands’, a ‘what to do now’ message and status information, such as percentage of storage space remaining, current cursor position, etc. Both rows and columns are identified by numbers, unlike many systems which use the alphabet for column headings. The repercussions of this are fairly great, since whereas ‘Q99’ is unambiguously a reference to a specified cell, ‘1799’ clearly is not. Multiplan provides several alternatives for identifying cells, but the simplest is that they be written as ‘RyCx’ – eg, ‘R17C99’ – a little bit longer than ‘Q99’!
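The ambiguity is easy to see if you try to translate between the two styles. This sketch (my own illustration, not Multiplan code) converts a Visicalc-style letter-and-number reference into Multiplan's ‘RyCx’ form; with letters the split between column and row is unambiguous, whereas ‘1799’ alone could be row 1, column 799 or row 17, column 99:

```python
# Convert a Visicalc-style reference like 'Q99' (column letters, row
# number) into Multiplan's 'RyCx' form. Column letters work like a
# base-26 number: A=1, B=2, ... Z=26, AA=27, and so on.

def a1_to_rc(ref):
    letters = "".join(ch for ch in ref if ch.isalpha()).upper()
    digits = "".join(ch for ch in ref if ch.isdigit())
    col = 0
    for ch in letters:
        col = col * 26 + (ord(ch) - ord("A") + 1)
    return f"R{digits}C{col}"

print(a1_to_rc("Q99"))   # Q is the 17th letter, so prints R99C17
```

No such conversion is possible in the other direction for an all-numeric label like ‘1799’ without extra punctuation, which is exactly why Multiplan spells out the R and C.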

Moving around

Moving the cursor around the spreadsheet is very simple – single control-key hits (ie, simultaneously pressing ‘Control’ and one other key) move the cursor left, right, up and down, with the VDU screen window being ‘pulled along’ by the cursor if an attempt is made to move to a cell off the edge of the screen. Sensibly, the keys that achieve this movement are arranged in a diamond (on the Sirius the arrow keys are used) – easy to remember and easy to touch-type when you are looking at the screen. Further investigation reveals that there are also control-key hits to ‘home’ the cursor to the top left hand cell and to the bottom-right, and a ‘Go-to’ command where destination coordinates can be typed in, as well as a rapid scrolling facility where the cursor is moved several cells at one go.

Also of particular interest is a very powerful split-screen facility. The screen can be subdivided into display areas (called ‘windows’ in the manual), each displaying different parts of the spreadsheet, and the cursor can be quickly ‘jumped’ from one to the next. There are many possible uses for this: locking row and column headings for continual display, quick movement between different parts of the spreadsheet, and keeping totals or whatever continually in view when other parts of the spreadsheet are being modified. Moreover each window can be displayed with a nice surrounding border, and can also be ‘linked’ to another window so that columns or rows in both always line up correctly. If all this sounds a little confusing to the newcomer, then take heart. You can completely ignore the facility at first, but once you are ready for it, the chances are that however you want to lay out your display, Multiplan will accommodate you.

Entering data

As with most spreadsheet systems, the ‘bread and butter’ activity centres on entering or changing numbers, titles and formulae. To achieve this, simply move the cursor to the cell to be changed and start typing whatever is required there. The only thing to watch out for is that text entry must be preceded by selecting ‘Alpha’ mode (simply press ‘A’ before typing the text) otherwise the chances are Multiplan will assume you are entering a command – occasionally disastrous. For example, a sensible abbreviation for Total-Costs-Yacht could be ‘TCY’. Enter this without pressing ‘A’ and Multiplan does a ‘Transfer-Clear-Yes’ wiping out the entire spreadsheet! Don’t believe it could happen? A PCW editor (I’ll spare his blushes) did it! Well, it probably wasn’t a yacht, but a yo-yo or a yard-of-ale or something…

The formulae themselves can be built up using a wide range of maths and other functions, including trig, standard deviation, string concatenation, logical and table look-up, etc. The notation used is the classic keyboard version of school maths notation, easily learned by anyone not already familiar with it. As we have already mentioned, formula references to cells require the ‘RyCx’ notation – eg, the formula to add the first two cells on the first row could be written as ‘R1C1 + R1C2’. However, there is a little trap lurking for experienced spreadsheet users – the replication facility does no formula adjustment whatsoever. Thus, if the above formula was located at R1C3, and then copied to 99 cells below, each and every copy would be ‘R1C1 + R1C2’ and the expected Column 3 = Column 1 + Column 2 would not be achieved. It turns out that the original formula, quite correct if no replication is envisaged, should be ‘RC[-2] + RC[-1]’, meaning ‘add the cell in the current row two columns back to the one in the current row one column back’. Now, wherever this formula is located, it will add together the two previous values on the row, and in particular, if replicated right down column 3 it will do the column sum correctly.
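The mechanics of a relative reference can be sketched in a few lines. This is my own illustration of the resolution rule, not Multiplan internals: the bracketed offsets are applied to wherever the formula happens to sit, so an unadjusted copy of the text still points at ‘two columns back’ everywhere it lands:

```python
# How a relative reference like RC[-2] resolves: the offset is applied
# to the cell holding the formula, so copying the formula text verbatim
# still yields the intended 'two columns back' at every location.

def resolve(row, col, row_off=0, col_off=0):
    """Resolve an R[..]C[..] offset at a given cell to an absolute RyCx."""
    return f"R{row + row_off}C{col + col_off}"

# 'RC[-2] + RC[-1]' placed at R1C3, then copied down to R2C3:
print(resolve(1, 3, col_off=-2), resolve(1, 3, col_off=-1))  # R1C1 R1C2
print(resolve(2, 3, col_off=-2), resolve(2, 3, col_off=-1))  # R2C1 R2C2
```

In other words, Multiplan shifts the adjustment decision from copy time to formula-entry time: the absolute form ‘R1C1 + R1C2’ really does mean those exact cells, everywhere.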

If typing ‘RC[-2] + RC[-1]’ seems like a bit of a fingerful (tactile equivalent of mouthful) then Multiplan to the rescue! Instead of working out ‘RC[-2]’, etc, simply use cursor moves in mid-formula entry and Multiplan will type in the formula for you. In the above example only the ‘+’ need be entered from the keyboard, the rest of the formula being built up by using the cursor to point to the cells to be referenced.

It is also possible to refer to cells by their row or column name and thus build up formulae like ‘profit = sales – costs’. Since (a) this is immediately comprehensible and (b) always replicates correctly, the extra typing involved is well worth it!

In conclusion, I must say that I did not greatly like Multiplan’s methodology for referencing cells. It should be noted that cell references occur not only in formulae, but are also required by the majority of commands (see below), so a major part of one’s time at the keyboard is spent using them. In fairness I must point out (a) that my previous spreadsheet experience has been with the Visicalc style of cell-reference and (b) that Multiplan compensates for this minor irritation with some excellent other features and facilities.


Thus far, we have looked at Multiplan’s basic essential facilities, but of course there are many other, typically more peripheral (in both senses!), functions needed to provide a comprehensive spreadsheet system. These extra functions are provided for by Multiplan commands, and invoked by selection from a command-menu.

Actually, in passing, we have already touched upon four commands provided by Multiplan – ‘Go-to’ cell, ‘Alpha’ for entering text, ‘Copy’ for replicating cells, and ‘Window’ for the split-screen facility. There are in fact 20 in all, each starting with a different letter of the alphabet, and all permanently displayed at the bottom of the screen. Bearing in mind that there were only six letters of the alphabet to spare, the implementers have done a pretty good job of choosing 20 sensible names – probably the worst one is ‘Alpha’ (it couldn’t be ‘Text’ because that clashes with ‘Transfer’ and ‘Transfer’ couldn’t be ‘File’, ‘Storage’ or ‘Disk’ because F, S and D are in use, etc).

Anyway, in the unlikely event that a command’s meaning is unknown, or in the more probable event that the precise method of usage is unclear, there is an excellent ‘Help’ facility available. Basically the list of command names has its own cursor, which can be shifted along by pushing the space bar. Commands can be selected by moving the command-cursor then pushing ‘Return’ (or by just typing the command’s first letter – much quicker). However, if ‘?’ is hit instead of ‘Return’ the spreadsheet screen is replaced with a ‘help’ screen for the currently indicated command. Moreover the information is not just a few cryptic instructions, but a fairly comprehensive run-down which in some instances extends to several pages. By the way, all the help-screen information is read from disk when needed, and does not affect the precious memory allocation for the spreadsheet itself.

To get some idea of the command facilities available, here is a quick rundown of all 20:

  • Alpha – enables text to be entered at the current cursor position.
  • Blank – blanks out one or more cells. Contents are blanked out, but the display format assigned to the cell is unchanged. Not the same as Delete since, in particular, the following rows or columns are not shifted.
  • Copy – copies cells from one place to another (ie, replication). Relative-copy is not possible (see text above) – you must do an absolute copy of a relative formula!
  • Delete – deletes a row or column of cells, moving all subsequent rows/columns back by one.
  • Edit – instead of correcting a long formula by retyping from scratch, this command can be used to apply changes quickly.
  • Format – numerous different display formats are possible: different column widths, centre, left, right justify, scientific, integer, financial, primitive bar graph, and more besides! As an extra convenience, a default format can be specified, assigning the format you most expect to use to all cells not explicitly reformatted to something else.
  • Goto – go to the cell specified by its name or coordinates.
  • Help – gives general help information not covered by the help-screens for each specific command.
  • Insert – inserts a blank row or column, moving all subsequent rows/columns along by one.
  • Lock – locks or unlocks specified cells. Can permanently lock all formulae – useful for turnkey systems.
  • Move – moves a row or column to between two other rows/columns.
  • Name – enables a cell or group of cells to be given a user-supplied name. This name can be used in formulae, and also by the ‘Goto’ command. It saves confusion if the name here is the same as the visible title.
  • Options – used to set basic operational features, eg, switch off auto-recalculation or audible error beeps. The former is very useful when the spreadsheet is getting fairly full and every change takes several seconds – not to register on the screen, but for its effects to permeate through the system. The latter is absolutely priceless if you work at home and your family ‘can’t stand that incessant cheeping’ (to quote my good lady).
  • Print – can print to printer or disk file. Option to print the formulae as well as the calculated values – useful for documenting or debugging the model. It’s also possible to print selected areas.
  • Quit – finish: back to the resident operating system (eg, CP/M, MS-DOS, etc).
  • Sort – sorts calculated or entered numbers or text by suitably shuffling rows.
  • Transfer – load, save, delete and other disk file operations. Of particular note: Multiplan can read Visicalc data files, or read/write files in a well-documented external interchange format, as well as using its own internal disk format. As it can also print to disk, it is extremely versatile in its file-handling.
  • Value – can optionally be used for entering formulae or numbers.
  • Window – split-screen facility.
  • Xternal – used to read in answers calculated by one spreadsheet as raw input data for another. Can be used for ‘consolidation’.


The documentation is comprehensive, clear and well-written. The bulk of it is in a stout ring-bound manual (minor niggle – the rings are not circular and tend to snag the pages when you are turning them quickly). It has obviously been put together with the sort of thoroughness we would expect from Microsoft, right from the Contents page at the front to the Index at the back. The basic material provided is:

  • System-specific instructions. How to create your working disks under your particular operating system.
  • Tutorial. Organised as seven lessons, giving key-by-key instructions, starting with simple cursor moves in lesson one through to multiple work-sheets at the end. Well illustrated.
  • Reference section. In alphabetical order, everything you need to know about the commands, key-strokes and formula-functions. Also includes a list of all system messages, together with advice on what to do when you encounter them.
  • Appendices. Extra helpful information, including a glossary and notes for Visicalc experts – a nice touch!
  • Quick Reference Guide. A separate pocket book (16 pages), being a condensation of the reference section in the main manual.
  • Help Screens. Comprehensive instructions on-screen for every command and a few of the other facilities.

With this breadth of documentation, there should be something to please all levels of user. Complete beginners can try the tutorial, experts will probably just use the quick reference guide or help-screens, and everyone can make good use of the comprehensive index.

Sirius slip-up

Having given the Apple version a thorough work-over, I arranged a joyride on somebody else’s Sirius. The article was nearly complete – I just needed to pencil in the Sirius Benchmark times and then off to Mustique for yet another three weeks.

First problem: the Sirius version of the Multiplan manual was temporarily mislaid. Well, I should know the system well enough by now. So, in preparation for Benchmark 1, I quickly set up the first 12 columns by 200 rows of the spreadsheet. (Readers familiar with the benchtests will know that this results in a display of 1..12 in the first row, 13..24 in the second, etc.)

Next I needed to set up column 13, each cell in it being the sum of the previous 12 in the row. Easy! Just use the row-sum function in column 13 of row 1, and then copy it down to all cells below it. Unfortunately I couldn’t remember the correct syntax for using it. Anyway, after experimentation I found that ‘SUM(C1:C12)’ at least did not give a formula error message, but it did seem to be displaying the wrong answer. Okay – time to copy it. Well, much disk-whirring and clanking, then watch the calculation count-down on the VDU display. 45 minutes later I’m still waiting; the disk is still whirring and clanking and the count-down’s still not finished – I’m frightened to switch off in case I corrupt the disk (it’s not mine, anyway) and can’t stop it at the keyboard. Anyway, it took about 50 frustrating minutes.

So, what went wrong? Well, basically a minor slip-up in my use of the SUM formula. I eventually got it right (by using a help-screen, what else?): ‘SUM(RC[-12]:RC[-1])’ and the whole test was over in under a minute. The formula I had originally used did not add the row up, but calculated the whole 12 x 200 array of numbers, and of course this formula was then copied 200 times down the column – a bit of a hefty number-crunch!

Anyway, the moral of this story is: make a good effort to learn Multiplan’s cell referencing – it could save you a long wait!
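A quick operation count shows why the slip was so costly. This sketch (my own arithmetic, using the article's 12 x 200 grid) compares the work done by the correct per-row sum with a formula that, as mistakenly entered, summed the entire block and was then copied into every one of the 200 rows:

```python
# Rough operation counts for Benchmark 1's 12-column by 200-row grid.

rows, cols = 200, 12

# Correct formula: each row adds its own 12 cells.
per_row_additions = rows * cols                    # 2400 additions in all

# Mistaken formula: sum the whole 12 x 200 block, copied into all 200 rows.
whole_block_per_row = rows * (rows * cols)         # 2400 additions, 200 times

print(per_row_additions, whole_block_per_row,
      whole_block_per_row // per_row_additions)    # prints 2400 480000 200
```

A 200-fold increase in work is quite enough to turn an under-a-minute test into a 50-minute wait on a floppy-based machine.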


We have taken a fairly fast swoop right through the major facilities and features of Multiplan; so fast that some very valuable features, not generally available in merely state-of-the-art spreadsheet systems, may have gone unnoticed. Just for the record, Multiplan gives you:

  • A ‘Sort’ command. If you need to sort columns of figures or text, it is impossible to do so without one.
  • Multiple worksheets. Results from one worksheet can be communicated to another, useful for consolidation.
  • Multiple split-screens. Very flexible facility to design VDU screen display of spreadsheet.
  • Flexible file handling. In particular data interchange with other software is feasible, and Visicalc data files can be read (but not written! – no doubt Microsoft doesn’t want to encourage users to migrate that way!).
  • Available on 16-bit microprocessors (8088/86). The new 16-bit processors can handle a lot more memory, and spreadsheet systems which have been properly installed on them can use this extra memory for setting up bigger spreadsheets (see Benchmarks).
  • Comprehensive help-screens.

In addition to these, Multiplan also provides more mundane, but by no means universally available, facilities – such as cell references by names, formula protection, formula printout, print to disk and formula editing.

Certainly Multiplan has a lot of facilities to offer, but what is it like to use? Well, some minor complaints here: the row/column numbering scheme increases the amount of typing for formulae; you have to consider replication consequences when you enter a formula, rather than when you do the replication; and you have to choose the ‘Alpha’ command before you enter text (okay, it’s only one extra character, but most other spreadsheet systems don’t do it this way). To balance these minor grumbles there are comprehensive error messages, and understandable prompts for all input.

So finally, my advice to spreadsheetless owners of Apples, CP/M or MS-DOS systems, or to anyone looking for an upgrade: put it near the top of your list!

Benchmarks and other measurements

These tests were run on an Apple II system with 64k of RAM (which is in fact mandatory) and an 80-column display card (which is optional). Available space for the spreadsheet itself amounted to 21k. Figures are also included for the Sirius (with 128k of RAM, and theoretically extendable to 800k+), running MS-DOS and allowing greater storage space for the spreadsheet. Where the Sirius figures are different they are appended in parentheses after the Apple figures.

Incidentally, a Sirius retails for around £2500, and the nearest equivalent Apple system (but with lower disk capacity, half the RAM, 8-bit processor) would be around £1750.

  • Spreadsheet size: 63 columns wide by 255 rows.
  • Numeric precision: 14 digits.
  • Max column width: 32 characters.

The benchmark tests are described in ‘Which Spreadsheet’, PCW Feb 1983.

Benchmark 1: (a) max rows accommodated: 95 (235); (b) recalculation time: 60 (55) seconds – ie, 1.5 (4) rows per second; (c) recalculation time: 60 (55) seconds; (d) vertical scrolling: 6 (6) rows per second; horizontal scrolling: 4 (4) columns per second.

Benchmark 2: max rows of text accommodated: 190 (Sirius not tested).

Benchmark 3: max rows of numbers accommodated: 190 (Sirius not tested).

Price: Around £150.


Documentation: 400+ pages, contents, tutorial, reference, index, quick reference and help-screens. Well-illustrated. Excellent.

User-friendliness: Consistent and easy to use — cell-referencing can be a little tricky!

Error-handling: 20+ error messages. Erroneous calculations (eg, zero-divides) displayed as special error values.

Facilities: Arithmetic and other functions: +, -, *, /, %, string operations, logic, descriptive statistics, trig, logs, look-up and more besides!

Configuration: version tested easily configured for different types of Apple screen.

Graphics: a let-down compared with the other facilities!

Interface to other software: specifically can read Visicalc files, and print to disk. Can also be interfaced to other software using data interchange format (requires programming skills to do this).

Spreadsheet overlays: yes – can do consolidation or merge information into existing spreadsheet.

Turnkey: Apple version is turnkey with all disk formatting, copying, etc, achievable without recourse to Apple DOS.

Insertion, deletion and replication: yes.

Display flexibility: just about everything you could possibly want. Excellent.

Protected cells: yes.

Formula printout: yes.

Formula editing: yes.

Automatic/manual recalculation: yes.

Out of memory: memory left permanently displayed. Recovers correctly when it runs out of memory.

Long jumps: can jump directly to any specified cell.

Sorts, searching and logic: yes.

First published in Personal Computer World magazine, April 1983

Data Management to the Rescue?

Kathy Lang checks out a flexible new CP/M package.

Regular readers will know that many of the packages I’ve reviewed in this series have particular areas of strength that make them well suited to certain areas of data management. This month’s offering, a British package called Rescue, which comes from Microcomputer Business Systems and runs under CP/M, is a general-purpose, menu-driven data management package which has much in common with others in this field. But it has unusually flexible provision for different types of information, and its data validation is among the best I’ve seen.

Rescue comes in three parts: the first deals with configuring the system for your computer, and is not needed again unless you make major changes. The second part covers the creation and amendment of the data files, and of the screen and report formats, while the third permits record amendment and display. This separation makes it easy to set up a system in which most users have access to the information in the files, but cannot change the format of those files or interfere with any provision for protecting parts of the data for security reasons.

Data is stored in fixed-length records in Rescue, but some ingenious methods are used to keep data storage to a minimum – I’ll say more about that later. Once you’ve set up a record format, you can still add fields to the end of the records, but you can’t change the sizes of existing fields unless you’ve made provision for that in advance. (MBS is apparently about to release an option to permit more radical changes to existing files, but it isn’t available yet). You can access the records in two ways. Individual fields may be used as keys, and any one of them used to access a particular record for display and/or editing. You can also select subsets of the data by setting up a set of selection rules, which are used to extract a set of records for browsing on the screen or for printing. You can set up as many screen and report definitions as you please for any set of data; these definitions need describe only a few fields in a record if necessary, and any or all of these descriptions may be password protected.

Rescue is used through menus, but users can set up their own menus through quite simple procedures. Thus you can set up a series of operations to be activated by one menu option. You can’t at present access one file from another, so that the current version of Rescue does not have true database capabilities.


Figure 1 shows the major constraints imposed by Rescue. The maximum record size of 1024 is the same as several others I’ve reviewed, but Rescue’s dictionary capability makes it more economical of data storage than many.

Some people will find the limitation of 60 characters in a field more serious. I haven’t included in the figure a full list of the field types allowed, as it is very lengthy. Virtually any kind of data format can be expressed with one of the field types provided. I’ll say more about them in the next section.

File creation

The process of file creation is shown in Figure 2, which is a ‘road map’ of all the menus associated with the data definition part of Rescue.

The first stage in file creation involves setting up a data description file, specifying the basic format of each record and the keys it will have. At this stage you must assign a data type to each field. There are four main groups of data: character, numeric, date, and dictionary. There are several forms of data type in each group; for instance, character data may be just that and contain any valid ASCII character, or they may be alphanumeric, in which case they may only contain letters or digits, and any attempt to enter invalid data will be rejected by the system. There is quite a variety of numeric fields, too, including money (sterling). You can specify that a field is to conform to a mask, to ensure that such items as account references, which often have prescribed formats, are entered in a valid form.
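The idea of a field mask can be sketched in a few lines. Rescue’s own mask notation isn’t documented here, so the token syntax below (‘A’ for a letter, ‘9’ for a digit, anything else literal) is an assumption for illustration only:

```python
import re

# Hypothetical mask tokens (not Rescue's own notation):
# 'A' = any letter, '9' = any digit, other characters must match literally.
MASK_TOKENS = {"A": "[A-Za-z]", "9": "[0-9]"}

def matches_mask(value: str, mask: str) -> bool:
    """Return True if value conforms to the mask, character by character."""
    pattern = "".join(MASK_TOKENS.get(ch, re.escape(ch)) for ch in mask)
    return re.fullmatch(pattern, value) is not None

# An account reference such as 'AB-1234' checked against mask 'AA-9999':
print(matches_mask("AB-1234", "AA-9999"))  # True
print(matches_mask("A8-1234", "AA-9999"))  # False - digit where a letter belongs
```

Rejecting the entry at input time, as Rescue does, means a badly formed account reference never reaches the data file at all.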

Probably the most unusual type of data is the dictionary field, which permits the person entering data to include only certain values. There are two kinds of dictionary field; a short form, which permits up to 29 characters in total to be used for each field, and a long form, which allows up to 255 entries, each of up to 60 characters. The latter are shared among all the fields in the file, so supposing one has a series of questions each with the same range of answers – for example, answers ranging from Poor to Excellent in a market research survey – you only need one dictionary entry for all the fields to refer to. Each response takes up only one character in the record in the data file for either type of dictionary, so the method is really a way of combining coding with captions for codes.
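The mechanism amounts to storing a one-character code in each record while a shared table holds the captions. A minimal sketch, with an invented ratings dictionary standing in for a Rescue long-form dictionary:

```python
# A shared 'long form' dictionary: one list of captions, referenced by
# several fields. Each record stores only a one-character code (here the
# index), so the caption text is never repeated in the data file.
RATING = ["Poor", "Fair", "Good", "Very good", "Excellent"]

def encode(caption: str) -> str:
    return str(RATING.index(caption))  # one character stored per field

def decode(code: str) -> str:
    return RATING[int(code)]

# Three survey questions all referring to the same dictionary:
record = {"q1": encode("Good"), "q2": encode("Poor"), "q3": encode("Excellent")}
print(record)                # {'q1': '2', 'q2': '0', 'q3': '4'}
print(decode(record["q2"]))  # Poor
```

Any value not in the dictionary raises an error at entry time, which is exactly the validation benefit the review describes.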

Every field within the record must also fall into one of four entry categories: mandatory (ie, the field must always have a value), optional (the field may be empty), calculated or display-only. Calculated fields are derived from calculations on constants or on other fields in the same record. Display-only fields are provided so that for certain modes of access fields can be shown but not altered – account numbers might for instance be protected in this way. Any field in a record may also be linked to others in a number of ways.

Direct linkage provides for situations where some fields only have values if another field – the controlling field – has a certain value. For instance, details about a property might say whether the property is freehold or leasehold, but only if it were leasehold would it be sensible to ask for the life of the lease and the annual charge. This approach can also be used to deal with records containing lists of information; you might want to store the names of all a person’s children, where some people might have as many as six, without asking six questions about childless people. Most packages expect you to hit at least one key for each question when entering data from the keyboard, but with the Rescue approach entry can be more finely tuned to stop prompting for answers that are not needed.
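The behaviour of a controlling field can be sketched as a data-entry routine that simply skips the dependent questions. The field names and prompts here are invented; the `ask` parameter stands in for keyboard input so the flow is easy to follow:

```python
# Sketch of a 'controlling field': the lease questions are only asked if
# tenure is 'Leasehold', and the list of children stops prompting once
# the stated count is reached. All names are invented for illustration.
def enter_property(ask):
    rec = {"tenure": ask("Tenure (Freehold/Leasehold)?")}
    if rec["tenure"] == "Leasehold":          # controlling field
        rec["lease_years"] = int(ask("Life of lease (years)?"))
        rec["annual_charge"] = float(ask("Annual charge?"))
    n = int(ask("Number of children?"))
    rec["children"] = [ask(f"Child {i + 1} name?") for i in range(n)]
    return rec

answers = iter(["Freehold", "0"])             # a childless freeholder
rec = enter_property(lambda prompt: next(answers))
print(rec)  # {'tenure': 'Freehold', 'children': []} - no lease questions asked
```

Only two keystrokes’ worth of answers were needed, where a flat questionnaire would have prompted for every field regardless.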

During file definition you must also specify the fields which are to be used as keys. Rescue treats the key field which is physically nearest to the beginning of the record as the main key, in that you have to ask specifically for other keys when you come to access the file; so it can save a little time to think about what order to store fields in the record. Up to 10 fields may be defined as key fields. Keys may be either unique or duplicate, and Rescue checks when supposedly unique key values are entered. All the key fields are referenced from a single index, which is automatically kept up to date when data is added or amended.

The next step is to define screen and print formats for the records; you can have as many of these as you wish, and each may describe only parts of the record – for instance, to prevent confidential information being seen by everyone. Next, you tell Rescue to set up an empty data file and structure the index file, and finally you construct any custom-defined menus you will need. If you do specify more than one screen or report definition, then you will have to do some customisation of the menus in order to use the alternative formats, but this is quite a straightforward process.

Input and editing

The provisions for data validation given by the dictionary facilities, by the variety of data types and by the range checking which can also be set up at file definition time, are extremely powerful – it’s always possible to get the data wrong in a logical sense, but Rescue makes it quite hard to get it wrong in any other sense. That said, I did find the mechanics of correcting data a bit clumsy; if you’ve made a mistake and go back to edit a record, you can say where in the record you want the editing to begin, but from there you must work sequentially through – you can’t work back up the screen, either when entering or editing data. Since the program requires you to have a terminal which can move the cursor left and right, it seems a bit strange not to utilise cursor movement up as well, since no terminal is likely to have horizontal movement but not vertical…

When you retrieve records for amendment, you do so by specifying a particular key value; you can specify the use of any key, but you have to get the value of the first four or five characters exactly right (except that Rescue is ‘case-blind’ in this situation, so it will for instance match Smith and smith). Even when matching exactly on a key value you may retrieve more than one record, as duplicate keys are allowed. But searching for field values within ranges is only possible when you want to look at records, not when you want to change them.

Screen display

I said that you can have several definitions for a single file, so that records can be displayed on the screen in different ways for different users or applications. These screen definitions can be created by copying existing definitions and amending them, but I couldn’t find a way to see what definitions I already had except by going out to CP/M and using the Directory command. Screen layout is specified by giving row and column coordinates for each field you want to display, which I found much more difficult to use than the ‘paint-a-screen’ approach which has become fairly common. The coordinate approach also makes it more difficult to amend the layout, though Rescue does have one provision to make this a little easier by letting you specify a re-ordering of the display without changing the absolute coordinates.

The screen layouts are set up in the ‘definition’ part of Rescue. However, they are invoked from the main part of Rescue, through executing one of the options in the menus shown in Figure 3. Display can be of records specified either by matching one key, or by selection using the selection and extraction procedure which is described later.


Rescue uses the same mechanism for printed reports as for screen display, so both are strictly record based. The only provision for aggregated information is totalling of numeric fields. It is possible to force page-breaks when values of particular fields change, but subtotalling is not provided. There is, however, a very flexible facility to interface with Wordstar and Mail-Merge, so it is easy to use them in combination with Rescue to write circular letters and concoct sets of standard paragraphs.


Rescue provides the ability to select parts of the data file for browsing, printing or further selection. The main method of doing this is to set up a set of selection rules in a file, and then to apply these to the data file to produce another file containing the selected records. The selection rules are very flexible: you have all the usual comparison operators (less than/greater than/equal to/not equal to) and data values can be compared with constants or with the values of other fields in the same record. Rules can be combined to provide ANDing and ORing within and between fields, and these combination facilities together with the NOT operator make it possible to select virtually any combination of values you could need. However, personally I don’t like the need to set up rules in a file, as it is rather cumbersome in practice; if you are using the standard facilities menus you must go to the ‘Maintain Rules’ menu (at the third level of menus), create the rules, then go back to the first level of menus and down to the third level ‘Extract and Sort’ menu to actually extract the records you need. Finally (from the same Extract menu) you can display or print the records that have been found. This provides a sharp contrast to the command language approach, in which one command will extract your records and a second at the same level will display them. However, you could tune the menus in Rescue to avoid some of this ponderousness, so it’s better in that sense than menu systems which you can’t adapt.
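The logic of a selection-rule file can be sketched quite compactly. The representation below – rules as (field, operator, value) triples, ANDed within a group and with the groups ORed together – is my own invented encoding, not Rescue’s file format:

```python
import operator

# Comparison operators; the names are assumptions for illustration.
OPS = {"EQ": operator.eq, "NE": operator.ne,
       "LT": operator.lt, "GT": operator.gt}

def select(records, groups):
    """Return records matching any group; a group matches if all its rules do."""
    def matches(rec, group):
        return all(OPS[op](rec[field], value) for field, op, value in group)
    return [r for r in records if any(matches(r, g) for g in groups)]

records = [{"name": "Smith", "balance": 120},
           {"name": "Jones", "balance": 40},
           {"name": "Brown", "balance": 300}]
# (balance > 100 AND name != 'Brown') OR (balance < 50)
rules = [[("balance", "GT", 100), ("name", "NE", "Brown")],
         [("balance", "LT", 50)]]
print([r["name"] for r in select(records, rules)])  # ['Smith', 'Jones']
```

As the review notes, AND/OR combination plus NOT is enough to express virtually any subset; the awkwardness in Rescue lies not in the rules themselves but in the menu journey needed to apply them.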

When comparing fields, upper and lower case letters are regarded as equivalent. You can use wild codes: ? will match any one character, * will match one or more characters. For dictionary fields, the order for comparison purposes is the order in the dictionary, so if you have a set of answers with Poor as the first and Excellent as the last, Poor will be regarded as ‘less than’ Excellent even though P comes after E in the alphabet. This is usually what you want, and with much coded data it would be a very valuable feature.
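Both behaviours are easy to sketch. Python’s `fnmatch` uses the same two metacharacters, though note its `*` matches zero or more characters where the review says one or more – a small assumed difference:

```python
import fnmatch

# Case-blind wildcard matching, approximating Rescue's ? and * codes.
def wild_match(value: str, pattern: str) -> bool:
    return fnmatch.fnmatch(value.lower(), pattern.lower())

print(wild_match("Smith", "sm?th"))   # True - case-blind, ? matches one char
print(wild_match("Smythe", "Sm*"))    # True

# Dictionary fields compare in dictionary order, not alphabetically:
RATING = ["Poor", "Fair", "Good", "Very good", "Excellent"]
def dict_less(a: str, b: str) -> bool:
    return RATING.index(a) < RATING.index(b)

print(dict_less("Poor", "Excellent"))  # True, although 'P' > 'E' alphabetically
```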


Rescue can sort a data file on up to five fields in one operation; the process is similar to selection, and you can also combine selection and sorting to give a sorted extract file. Sorting is either in ascending or descending order; as with selection, dictionary fields sort in their dictionary order (Poor before Excellent) rather than in alphabetical or numeric order. In addition, ordinary character fields can be given a sort value which is different from their simple alphabetical order. This could be particularly useful where you had fields such as book titles, which often have prefix words such as A or The, which you want to ignore for sorting purposes but wish to include as part of the field for printing. (In most packages these prefix words must occupy a separate field, which will be empty for titles without a prefix word.)
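The prefix-word trick amounts to deriving a sort key that drops the leading article while the stored field stays whole. A minimal sketch, with the prefix list as an assumption:

```python
# Derive a sort value that ignores prefix words such as 'A' or 'The',
# so the title stays whole in the record but sorts under its first
# significant word.
PREFIXES = ("A ", "An ", "The ")

def sort_value(title: str) -> str:
    for p in PREFIXES:
        if title.startswith(p):
            return title[len(p):]
    return title

titles = ["The Hobbit", "A Tale of Two Cities", "Brighton Rock"]
print(sorted(titles, key=sort_value))
# ['Brighton Rock', 'The Hobbit', 'A Tale of Two Cities']
```

In most packages the same effect needs a separate prefix field, empty for titles such as ‘Brighton Rock’ that have none.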


The calculation facilities in Rescue are quite powerful in the input phase, and practically non-existent after that. When you set up a data definition file, you can specify that a field is to be calculated from constants, or from combinations of other fields (including dictionary fields) in the same record. All the usual arithmetic operators are available. After input the only calculation you can request is totalling on printed reports; this is activated by requesting totalling of a field when a description file is set up. Up to 10 fields in any one description file may be set to be totalled.


Protection in Rescue is of two kinds. It is possible to take the programs used in the Define stage off the run-time disk, so that the ordinary user can use file definitions and screen and report formats, but not amend them. At a more detailed level, password protection can be provided for particular data files, for individual description files (so that a user can be given access only to part of the data in a file) or for particular menu items in custom built menus (so that some users may have access to some functions but not others, while other users have greater facilities, but all within one menu). This is a flexible and powerful scheme, and should provide for most needs.

Stability and reliability

I didn’t have any problems over reliability with my use of Rescue. As to stability, new versions of Rescue, which are ‘cost options’, are intended to be compatible with existing versions. New features in the pipeline include a version for MS-DOS and a multi-user version.


As usual, the first task is to tailor Rescue for your particular terminal. This appeared quite straightforward (although, as is the common bad practice, you can’t be sure the tailoring has worked until you actually run the main Rescue suite). However, I had one misunderstanding which I never managed to sort out; this resulted in repeated prompts being printed on the same line as the error messages, which were thereby overlaid so that I couldn’t read the error message. I wasn’t able to discover whether this was an error in the software, the documentation or my interpretation of them and my Sirius manual, but it hasn’t happened to me before. While tailoring for the terminal, you can tell Rescue about cursor movement left and right but not about which keys move the cursor up and down, so much potential editing flexibility is lost.

Once into Rescue, the main tailoring facility is the ability to set up sequences of activities on custom-defined menus. This gets round some of the inflexibilities associated with menu-driven systems, and I found the approach quite easy to use.

Relations with outside

Rescue can write files in standard ASCII characters, using the ‘comma delimited’ format required by many other packages including specifically Wordstar’s Mail-Merge option. Thus you can set up files of information which you want included in circular letters or standard paragraphs, and then fire them off to Wordstar or another similar package.

Within Rescue you can include on a menu the ability to run another program, so it would be possible to tailor a menu to carry out a selection/printing sequence of this kind, called by Rescue ‘record processing’, without the user having to go back to CP/M. You can’t at the moment read external files of ASCII records into Rescue, though a menu option for this is already shown, which I’m told will be implemented in the very near future.

User image: software

Once again, your overall reaction to Rescue will be governed by whether you like menu-driven packages or not. I found the ability to tailor menus to provide facilities oriented to particular requirements a big help in mitigating the inflexibilities of menus. However, most users are likely to follow the well-established principle of ‘satisficing’ (a word coined by Herbert Simon, the psycho-economist, to describe the tendency to accept adequate or satisfactory results rather than go for the best possible) and only set up extra menus when they absolutely have to, for instance to access alternative screen layouts. So I suspect that mostly people will use the rather cumbersome standard menu facilities. I also had a rather mixed reaction to the complete separation of description of and access to the data files. Within an organisation which has a database administrator (who might simply be the boss in a small business) this could be a useful separation for security reasons, but it would be less helpful where the same person organises the data files and puts information into them, perhaps in a small office, one-person business, etc.

Within the package itself, I as usual found some goodies and some nasties. The progress through the menus was orderly and logical and was made straightforward by the provision of the two ‘road maps’ which I show as Figures 2 and 3. The process of prompting was easy to understand. It would have been even easier if, when a question has a default response, this were displayed before the question is posed – in many cases the default is not shown even after you’ve accepted it, unless you go back and edit the record concerned. Allowing the use of identifiable abbreviations, both for field names and for data values, is sensible.

I didn’t like the use of row and column coordinates when formatting screen displays and printed reports, especially as there is no default format so you always have to supply one. The ‘paint-a-screen’ approach is much easier in general than coordinate specification and if this is not supplied then there should at least be a default format with records displayed one field per line starting at the left of the screen or paper. I also found the inability to move back within a record when editing a real nuisance.


The manual is basically a reference document but written in so much detail that it could be used to teach yourself about the package if you were reasonably familiar with data management terminology. However, the amount of detail makes it rather difficult to find your way around. Two goodies help a little in this: the use of emphasis within the text to call the reader’s attention to the most important parts of each section, and the printing of chapter headings right-aligned on each page (a real help to browsing at a general level). But the chapter names didn’t always make it easy to guess where a particular feature would be described, and since there was neither a detailed table of contents relating to each chapter nor an index, it was very hard to get from ‘now I’ve seen something about that feature somewhere’ to the exact part of the manual in question. Part of the remedy is close at hand, since if the ‘road maps’ (which perform most of the functions of a reference card) were annotated with the numbers of the sections documenting each menu item, readers would find it very much easier to locate the particular piece of information they need fast. (As this article went to press, MBS issued an index for the manual, which should help.)

The other problem I had was that while each feature is documented in detail with examples of the particular feature, there are no examples of the display or use of groups of features. For instance, all the features of data entry are described in turn, but there is no figure showing how data definitions are displayed on the screen. Nothing bolsters a user’s confidence like some complete examples shown in real screen pictures!

I can’t resist ending this section by awarding MBS second prize so far in this year’s contest for manual typo errors, with ‘Data Validification’.

Costs and overheads

Rescue costs £295, and is available from MBS. To be realistic, you would need a disk system with the regular double-sided, double-density capacity of 370 Kbytes per drive on a two-drive floppy disk system, so that you can keep all the Rescue software on one disk drive and use the other for data. I found the system very slow in loading individual program modules, which seemed to happen whenever I changed from one sub-menu to another. I was told that this was specific to the Sirius-Z80 card method of disk access, but I haven’t noticed the problem with other packages I’ve used. The times for actually running the Benchtests are shown in Figure 4. (Details of the tests were given in PCW December 1982.)


Rescue provides data management facilities through individual files. Data description facilities are very powerful. Rescue provides a variety of data types and validation features more extensive than any I have found before. These features also help to make Rescue much more economical on data storage than is usual in programs which use fixed length records. You can select and sort the data to provide pretty well any required subset but the process is rather cumbersome. Screen and report formats can be varied according to the needs of particular users, which makes it straightforward to protect particular data items; you can also permit users access only to certain Rescue features. Screen and report formats are described in a rather rigid way, and there are no default formats for easy initial use.

On the other hand, the ability to send data to and run Wordstar’s Mail-Merge option from within Rescue could be very valuable in some environments. Apart from the calculation features on data entry, the only calculating power within the package is the ability to total particular fields. The system is menu-driven, which can be ponderous in use, but you can if you wish design your own menus to mitigate this disadvantage to some extent. Rescue is in the main a single-file system – you cannot reference one file through data values in another. Provided this limitation is not a problem, you would find Rescue worth investigating, particularly if the variety of data types and the extensive data validation would be beneficial in your application.

Fig.1. Constraints  
Max no. files in one menu structure 20
Max file size CP/M limit or disk size, whichever is smaller
Max no. records 32760
Max size record 1024 characters (but good data compression methods)
Max no. fields 100
Max field size 60 characters, 14 digits
Max no. keyfields 10
Field types See text – several varieties of character, numeric, date (day/month/year), monetary (sterling), dictionary


Fig.2. ‘Roadmap’ of menus


Fig.3. Menu options

Fig.4. Benchmark times
BM1 Time to add 1 new field to each of 1000 records Setup time
BM2 Time to add 50 records interactively Scrolling time
BM3 Time to add 50 records “in a batch” NA
BM4 Time to access 50 records from 1000 sequentially on 25-character field 1 min 20 secs
BM5 Time to access 50 records from 1000 by index on 25-character field NA* (1-3 secs)
BM6 Time to index 1000 records on 25-character field 12 mins
BM7 Time to sort 1000 records on 5-character field 4 mins 10 secs
BM8 Time to calculate on 1 field per record and store result in record NA
BM9 Time to total 3 fields over 1000 records NA yet
BM10 Time to import a file of 1000 records NA yet
Note: NA=Not available. NA*=Not available as tested – key must match exactly.

 First published in Personal Computer magazine, April 1983

A Piece of the Action – The Multi-User Sig/Net

Terry Lang investigates the benefits – and drawbacks – of a shared access system from Shelton Instruments.


Front view of a multi-user system with hub and satellites stacked together.

In building their phenomenal success, microcomputers have had the advantage of needing only to provide an operating system which supports just a single user. This has enabled them to avoid much of the dead weight which encumbers mainframe systems. However, there has always been a need for micro systems to support a small number of simultaneous users – for example in neighbouring offices in a small business. (Such users will always need to share access to common data for business purposes. Sometimes users choose to share peripherals – eg, hard disks or printers – simply to save money, but the economic reasons for this latter type of sharing are likely to weaken as the technology continues to develop.)

Even in a shared microcomputer system, it has generally been economic to provide a separate processor for each user, and thus the spirit of simplicity in the operating system can be maintained. Nonetheless, the administration of the shared data does impose an additional challenge, and it is always interesting to see how this challenge is met.

In this article I will be looking at the way this is tackled by the Sig/net system produced by Shelton Instruments Ltd in North London. During a previous incarnation I was responsible for buying a large number of single-user Sig/net systems, which met all my expectations at that time, and I was keen to see how the multi-user combination would be carried through.



Rear view of multi-user system showing ribbon bus cable and terminal and printer ports.

The original single-user Sig/net is itself based on a ribbon-cable bus which connects together the internal components of Z80 processor and memory board, disk controller board, and communications boards (serial and/or parallel). In developing a multi-user system it was therefore a natural step to extend the bus cable to simply chain on other systems, each supporting a single user by means of a processor and memory board. This is illustrated in Figure 1.


Fig. 1. Modules making up the ‘hub’ and user satellite processors on a multi-user system.

The central or ‘hub’ system with one floppy disk and one hard disk fits in a case of its own. The satellite user systems fit three to a case, and these cases are designed to stack neatly with the ‘hub’ as shown. As many satellite cases as may be needed can be chained on via the bus cable. (I understand a 14-user system is the largest installed so far.)

The basic component boards, with the exception of the new ring-ring bus connector, are all those which have proved very reliable in the original single-user system. (Since the company has a considerable background in process control, reliability should be something it appreciates.) To my mind the cases do run rather hot, but I am told this has not caused problems.

The bus cable runs at a maximum speed somewhat below 1 MHz, not particularly fast but adequate for the purpose, as I shall discuss below. More significantly, it has a maximum length of only a few feet. This is sufficient for stacking the cases as illustrated in the photographs, but does mean that all the processors and disks have to be sited in the same room. Of course the user terminals are connected via standard RS232 serial communications ports, and can thus be located wherever required (using line drivers or modems for the longer distances).

Alternatively, it is also possible to connect a complete satellite to the hub via an RS232 link. This would enable a satellite with its own floppy disk to be placed alongside a user and distant from the hub hardware, but it would mean that access to the files on the hub would be correspondingly slower.

Both the hub and the user satellites use Z80A processors running at 4 MHz. For the purposes of the standard PCW Benchmark programs, which are entirely processor-bound and make no reference at all to disks, it didn’t matter at all that a multi-user system was involved, since each Benchmark program ran in its own satellite processor plus RAM board, unaffected by the rest of the system. The Benchmark times, with the programs written in Microsoft Interpretive Basic, are given in Figure 2.

These times are as good as one would expect from an equivalent single-user system and illustrate the benefits (or perhaps one should say the lack of drawbacks) of this kind of multi-user sharing. (Of course, where user satellites share access to the common hub filestore, then the user programs will slow each other down – this is discussed in detail below.)

The one-off end-user prices for multi-user and single-user Sig/net systems are given below. These represent very reasonable value for money. Much of the system is of British manufacture or assembly, which should help price stability. It should be emphasised that in addition to the prices quoted you would require an additional terminal for each user. (Integral screens and keyboards are of course not appropriate to this configuration of centralised hardware. This does permit a range of terminal choice according to need.)

An important feature is the ease with which a single-user system can be upgraded to multiuser. The old single-user system simply becomes the hub, with one of the floppy disk drives exchanged for a hard disk. Multi-user satellites are then added as required. If you find a dealer who will give you a reasonable trade-in on the exchanged floppy, then the upgraded system should cost you the same as if you went multi-user straight from the start – a cost-effective upgrade path. Since a satellite case and power supply can be shared between three users, it is most cost-effective to add three users at a time, for a cost of £622 per user (plus terminals, of course).

For those who need such things, other peripheral hardware is also available – eg, graphics drivers, A/D converters, industrial I/O, S100 bus adaptor.


Inside view of case with three user satellite processors and common power supply.

Sharing a hard disk

So much for a single user accessing one file over the McNOS network. As the next step, I looked at the facilities for several users to access different files on one hard disk. McNOS provides for separate users to be identified by distinct system ‘user names’, and each user name is protected by its own password. All files remain private to their owner unless explicitly made public via the appropriate command.

Each user name is provided with both a main directory and with up to 16 subdirectories (just as if the user had 16 separate floppy disk drives) identified by the letters A to P. Thus instead of the traditional CP/M prompt of the form

A>

where A identifies the logged disk drive, in McNOS this becomes

AC>

where A identifies the hard disk drive and C the default sub-directory for this user. Whenever the user creates a new file, space for it is taken from wherever it can be found on the drive. Some multi-user systems divide the hard disk up in advance so that each user has a fixed allocation; whilst this protects other users against an ill-mannered user grabbing more than his share of space, it also means that space allocation has to be fixed in advance. In a well-ordered community, the McNOS approach is much more flexible.

To measure the effect of sharing the one disk, I repeated my Benchmark, with a different file on the hard disk for each of two users. When I ran the program for just one user alone, the execution time was 33 seconds; when I did the same for the second user alone, the time was 54 seconds. This very large difference was due to the different positions of the two files on the disk, thus requiring different amounts of head movement. (This is one of the bugbears for would-be designers of benchmarks for disk systems!)

Then to measure the effects of sharing, I set the second user program to loop continuously and timed the program for the first user. With this sharing, the execution time increased from 33 seconds to 205 seconds. This increase is explained partly by the competition for buffer space in the hub, but I suspect largely by the greatly increased disk head movement as the head moved constantly between the two files. This is inevitable for physical reasons under any operating system. Sharing access to one disk is going to have a big impact if a number of file-intensive activities are run at the same time; but this should not be a problem for programs where disk access is only occasional (eg for occasional interactive enquiries).

Sharing a file

However, as I indicated at the beginning of this article, the real reason for a multi-user system is often to provide different users with shared access not just to the same disk, but to the same file at the same time (eg, for stock enquiry and sales order entry from several terminals). But if one program is going to read a record, alter the contents, and finally rewrite that record, then that whole updating process must be indivisible. (For if a second program read the same record at the same time and tried to rewrite its new data at the same time, the two processes would interfere with each other). To overcome this problem of synchronisation, a ‘locking’ mechanism (sometimes called a ‘semaphore’) is required, whereby a process carrying out an update can ‘lock’ the record until the update is complete, and whereby any other process accessing that same record at the same time is automatically held up until the lock is released.

On a mainframe database system it is generally possible to apply a lock to any record in this way. However, this can be rather complex (for example if two adjacent records share the same physical disk sector, then it is also important not to allow two programs to buffer two copies of that same sector at the same time).

In keeping with the spirit of micro systems, McNOS implements a simpler compromise mechanism, by providing one central pool of ‘locks’ stored as 128 bytes in the hub. A user program can set a lock simply by writing to the appropriate byte, and release it again by clearing that byte. It is up to programs which wish to share access to the same data to agree on which locks they are to use and when they are to use them. In general the programs will by agreement associate a lock byte with a whole file rather than with an individual record, as this avoids the problem of two adjacent records sharing the same buffer. It also avoids the problem of the restricted number of locks (even if a bit rather than a byte is treated as a lock, this still only provides 1024 locks).

McNOS maintains the lock record on the hub as if it were a file (of just one record) called LOCKSTAT.SYS, though this ‘file’ is in fact stored in RAM and never written to disk. A user program which wishes to set a lock simply generates a request to read this record. If the record is returned with byte 0 set to non-zero, this indicates that some other process is itself busy setting a lock: the program must then wait and try again later. When the record is returned with byte 0 set to zero, the program may examine the bytes (or bits) it wishes to set and, if it is clear to proceed, set them and rewrite the record. (The reverse process must be followed later to clear the bytes and hence release the locks.)
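As a rough illustration of how such a protocol behaves, here is a simulation in Python. This is not McNOS code: the function names and the in-memory byte array are inventions for the sketch, though the 128-byte LOCKSTAT.SYS record and the busy flag in byte 0 follow the description above.

```python
# Illustrative simulation of the McNOS central lock pool: a single
# 128-byte 'record' held in the hub's RAM. Byte 0 acts as a busy
# flag guarding the pool itself; the remaining bytes are the locks
# that cooperating programs agree to associate with shared files.

lockstat = bytearray(128)  # the LOCKSTAT.SYS record, all clear

def read_lock_record():
    """Ask the hub for the lock record; marks it busy if it was free."""
    if lockstat[0]:
        return None          # pool busy: caller must wait and retry
    lockstat[0] = 1          # mark the pool busy for this caller
    return lockstat

def set_lock(byte_index):
    """Try to claim lock byte 'byte_index'; True on success."""
    rec = read_lock_record()
    if rec is None:
        return False         # another process is updating the pool
    free = rec[byte_index] == 0
    if free:
        rec[byte_index] = 1  # claim the lock for our file
    rec[0] = 0               # rewrite the record: pool free again
    return free

def clear_lock(byte_index):
    """The reverse process: release the lock."""
    lockstat[byte_index] = 0

# One program claims lock 5 for a shared file; a second is refused
# until the first releases it.
print(set_lock(5))   # True  - claimed
print(set_lock(5))   # False - already held by the first program
clear_lock(5)
print(set_lock(5))   # True  - free again
```

The per-file lock byte (rather than per-record locking) matches the compromise described above: coarser, but simple and safe.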

To measure the impact of this locking mechanism, I next changed the Benchmark program for the first user so that it shared exactly the same data file as the second user. McNOS provides a particularly convenient way of doing this, for it is possible to create in one directory entry a pointer not simply to a file, but rather to another file entry in another directory. Thus all I needed to do was to change the directory entry for the first user so that the original file name now pointed to the data file of the second user. Running the Benchmark for either user alone now took 54 seconds (ie, I was using the ‘slower’ of the two data files as far as disk head movements were concerned). I then changed the Benchmark program itself for the two users, so that each read/write pair was bracketed by a lock and an unlock operation as would be required for sharing the file. Now running the Benchmark for either user alone took 106 seconds – a measure of the overheads of using the locking mechanism.

Finally I ran the programs for the two users simultaneously. This meant that the overheads of the locking mechanism, of buffer sharing in the hub and of competing head movements were now all included, resulting in a total execution time of 262 seconds. All of which simply shows that the sharing of data in this way consumes resources (as usual, you do not get ‘owt for nowt’).

Another important resource is of course software. Just because the operating system provides a locking mechanism does not mean that you can take any CP/M program, run it from two terminals, and neatly share simultaneous data access. This will happen only if the program is explicitly written in the first place to use the locking mechanism. At least two general data management packages are already available which use the McNOS locking mechanism: ‘Superfile’ from SouthData of London (reviewed in PCW January 1983), and ‘aDMS’ from Advanced Systems of Stockport (PCW review shortly).

Multi-user software

Thus in the Signet multi-user configuration we can see hardware which is a simple extension of a single-user system. However, the software extension is not quite so straightforward when moving from a single-user to a multi-user operating system. The need for such a system of course became apparent some considerable time ago. Unfortunately, the first attempts by Digital Research to extend CP/M in this direction ran into a number of difficulties. Therefore Shelton was obliged to look elsewhere, and eventually obtained the McNOS (Micro Network Operating System) system from its originators in the USA. McNOS aims to provide a file store and printer spooling system in the hub processor, plus a CP/M-like environment for each satellite user, and the necessary communications software to link them together. As others have found who have followed the same route, a lot depends on exactly what you mean by ‘CP/M-like’. While a well-behaved program may just use CP/M by calling on it in the approved fashion for any functions it needs to have carried out, many other programs also call upon the internal subroutines of CP/M or utilise direct access to its internal data tables.

Indeed, in the early days of CP/M, many programs were forced to employ such dodges in order to work at all. (One well-known package reportedly follows each call to write to a file by a ‘close’ call in order to force the writing of any partially filled buffers; though the file is thus repeatedly closed and never subsequently re-opened, earlier versions of CP/M would still allow the following ‘writes’ to take place.) For such programs any strict implementation of CP/M is sure to stop them running. With additional work by Shelton, these problems were eventually overcome by relaxing the conditions of the CP/M-like environment to permit such dodges to be employed.

In the single-user versions of CP/M such dodges did little harm since, if the worst came to the worst, the user would only upset his own program. In a multi-user situation, however, it must be realised that such dodges, if incorrectly employed by a user program, can upset other users as well. This has to be accepted as the price of making sure that the whole wealth of existing CP/M software will continue to run in the multi-user environment.

Before looking at how disks and files can be shared between several users, I thought I should first check how much delay is introduced into file accesses for a single user with a file which is no longer on his own satellite system, but which is now accessed on the hub through McNOS over the connecting lines. For this purpose I constructed a file of fixed-length records, and wrote a simple Basic program which read and then rewrote each record. Records were taken alternately from either end of the file, stepping up from the bottom of the file and down from the top until the two met in the middle, thus ensuring a reasonable spread of disk head movement. To provide a norm for my measurements, I first ran this program in a true single-user standalone CP/M Signet system with floppy disks, and obtained an execution time of 257 seconds. Next I transferred the floppy disk to the hub of the multi-user system and re-ran the program from a satellite. The first thing I noted (cynic that I am) was that the program still ran, and that the floppy format was indeed the same under McNOS as CP/M. Would you now care to guess the execution time running over the network? In fact it was 53 seconds, a reduction of almost 80%! The reason for this of course (and it may be ‘of course’ now, but I confess I didn’t expect it at the time) is that much of the 64K RAM in the hub system can be devoted to file store buffering, thus minimising the number of physical transfers actually needed. (If other users had been running at the same time, they would have taken their own share of these buffers. Where there is competition, McNOS sensibly arranges to keep in its buffers that information which has been most recently accessed.)
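The record ordering used by the benchmark is easily sketched. The following fragment (in Python, purely illustrative; the original program was in Basic) generates the alternating-ends sequence described above:

```python
# Sketch of the benchmark's record order: records are taken
# alternately from either end of the file, stepping up from the
# bottom and down from the top until the two meet in the middle.
# This spreads the disk head movement across the whole file.

def benchmark_order(num_records):
    order = []
    low, high = 0, num_records - 1
    while low <= high:
        order.append(low)           # next record from the bottom
        if high != low:
            order.append(high)      # next record from the top
        low, high = low + 1, high - 1
    return order

print(benchmark_order(6))  # [0, 5, 1, 4, 2, 3]
```

Each record in the resulting sequence would then be read and rewritten in turn, forcing the head to seek across nearly the full width of the file on every pair of accesses.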


Processor/memory card, serial communications card and bus interface to support a single user.

The terminal command language

In the beginning were mainframes, which ran programs in batch mode. Because the user could not direct his program from a terminal but had to think ahead for every likely eventuality, the operating system provided a ‘Job Control Language’ to help in directing the compiling, loading and executing of programs. Some Job Control Languages were so elaborate that they could even be used to solve differential equations (or so rumour had it). Then came the micros and operating systems like CP/M, with very simple commands which could be used from terminals. This command structure could hardly be dignified with the title ‘language’ (even though SUBMIT and XSUB do give the possibility of issuing several commands at once). There does seem a need for a more comprehensive job control language, even on micros, for tailoring packages and giving the user turnkey systems. (Sometimes this is done through a specially written program, or via a general-purpose ‘front-end’ package which sits on top of CP/M.)

McNOS tackles this situation by providing its own job control language, complete with variables, arithmetic and assignment statements, conditional expressions, and subroutines. All this is of very great power, but at the cost of considerable overheads in processing time. To test this out, in a pale imitation of those who solved differential equations with the job control language on mainframes, I coded one of the PCW Benchmarks in the McNOS command language. This ‘program’ is shown in Figure 2. I estimate (since I didn’t feel inclined to wait for the whole 1000 iterations to finish) that this program would have taken over 14,000 seconds to complete (compared with 9.6 seconds in Basic)! Time may not be so critical in more typical job control situations, but it must be possible to do better than this. However, you do not need to use this power if you don’t need it. It is perfectly possible to stick to a very small subset of the simple commands, which then makes the system very like CP/M. Unfortunately, of course, it cannot be exactly like CP/M because it is necessary to maintain a unified underlying syntax capable of supporting the larger language too. As a fairly experienced user of CP/M I must say I had no difficulties with the differences, though they would prevent a novice user from working with a standard CP/M primer as a guide. (I have heard it said that at least one user was so impressed by the McNOS command language that he asked to have it implemented on his single-user CP/M systems as well.)


Fig.2. Coding of PCW Benchmark Program 3 in McNOS Terminal Command Language.

Future developments

A user who is just starting on a microcomputer development which requires only one system now, but which could expand to become multi-user later, could well choose a Sig/net system for its development potential. If Shelton maintains its record in exploiting its technical expertise, then further developments can be expected. I understand that one of these is the provision of a local area network facility based upon the Datapoint ARCNET approach. This will be used instead of the current ribbon bus to provide high-speed communication over much longer distances, and thus permit the siting of user satellite systems away from the central hub. I must point out, however, that this is not yet an available product; as Guy Kewney so aptly put it in this same magazine, ‘the future is not now…’


The Shelton Sig/net system is based on good hardware and provides good value for money. The system provides a convenient, cost-effective growth path for the user who wants to start small but expects to expand to a multi-user system later. The McNOS multi-user operating system provides convenient facilities for users who wish to share data between a number of terminals and a number of CP/M programs, provided this can be done on a scheduled basis (ie, no file being used in update mode by more than one user at a time). It is also possible to share simultaneous update access to the same data files with programs written specifically to take advantage of the McNOS ‘locking’ mechanism. The powerful McNOS terminal command language would be useful in some circumstances, but can be slow to use.

Benchmark timings (all times in seconds)
BM1 1.1
BM2 3.4
BM3 9.6
BM4 9.3
BM5 10.0
BM6 18.1
BM7 28.9
BM8* 51.3
Average 16.5
*Full 1,000 cycles  
For a full explanation of Benchmark timings, see PCW November 1982


Prices – Multi-User  
Hub filestore, 1 x 400K floppy  
Hard disk, 5.25Mb (formatted) £2,695
Hard disk, 10.5Mb (formatted) £2,954
Hard disk, 15.75Mb (formatted) £3,195
Hard disk, 21Mb (formatted) £3,500
Satellite case  
1-user (Z80A, 64K RAM, 1 x RS232) £1,100
2-user £1,550
3-user £1,865
Single User  
Z80A, 64K RAM, 2 x RS232  
Floppies 2 x 200K £1,390
Floppies 2 x 400K £1,690
Floppies 2 x 800K £1,890

First published in Personal Computer World magazine, April 1983

Smartmodem 1200 and Smartcom II


Although modem users in the UK will be familiar with Hayes protocols, Hayes’ products are still relatively unknown here. Peter Tootill looks at the Smartmodem 1200 and Smartcom II, a terminal program package for Hayes’ or Hayes-compatible modems.

Hayes is a new name in the UK, but will be familiar to anyone who has come into contact with the US telecomputing scene. Hayes was one of the pioneers of the ‘intelligent’ modem – that is, one that has a built-in microprocessor and can respond to commands from a computer. The command system that Hayes devised to control its modem has been followed by other manufacturers, and now Hayes protocols are the standard for intelligent modems and software designed to be used with them.

The demand for Hayes-compatible modems has come with software such as Symphony with its built-in communications features which allow the user to dial numbers stored in its database files. Hayes is making a concerted effort to push into the UK modem market and is not only working through a distributor, but has set up a separate company run, initially, by US staff. The company’s three introductory products are: a modem (Smartmodem 1200); a terminal program (Smartcom II); and a database package called ‘Please’. In this review I’ll look at the first two.

Smartmodem 1200

The Smartmodem 1200 is a single-standard V22 (1200 bits/sec full duplex) type. It is a very smart-looking modem, measuring only 5.5in x 9.5in x 1.5in, and has an external power supply. The UK version of the very popular US Smartmodem 1200 has lost its 300 bits/sec capability and gained BABT approval. The front panel carries eight LED indicators for transmit and receive data, auto-answer on, terminal ready, modem ready, carrier, off hook and high speed. The last one seems to be a hangover from the US version; it indicates that the modem is working at its highest speed – which in this case is its only speed, so it is rather superfluous. The front panel is removable, giving access to a row of 10 switches which allow you to configure various modem parameters such as auto-answer, how the DTR and CD lines operate, and so on. The rear panel has a standard 25-way RS232 connector, an on/off switch, a socket for the external power supply, and the telephone cord with a normal BT plug on the end. There is no socket to plug a telephone into, so you will need a double adaptor if you require a handset on the same line. The modem’s case is made from sturdy aluminium and plastic.

Removing a couple of screws allows the modem’s PCB to slide out. Inside, the modem is of the same high quality as the exterior.

The modem supports the basic Hayes command set – it is, of course, Hayes-compatible (not all Hayes-compatible modems are fully Hayes-compatible – just as not all IBM-compatibles are completely compatible) and should work with any Hayes software that doesn’t expect the extended command set of the Smartmodem 2400 (a V22/22bis modem soon to be launched in the UK). It doesn’t have a number store, but as most smart terminal programs have telephone directories built-in, this is not a significant omission.

The Smartmodem 1200 supports tone and pulse dialling, detects dial tones and engaged tones, and also has a built-in speaker so that you can monitor call progress. The volume of the speaker can be changed under software control by the computer.

In use, I found that the modem worked reliably, although I experienced a small problem when calling a US system: it couldn’t pick up the carrier from the other end – I must admit it is rather faint, but my old British Telecom 4124 modem doesn’t have any difficulty.

The current recommended list price of £575 (excluding VAT) is rather on the high side. It compares reasonably well with single-standard V22 modems from other manufacturers, but when you consider that you could get more features for a similar price with a WS3000 or, for a little more, a Steebek Quattro, it does look a bit steep.

A spokesman for the company stated that the price included a two year warranty and excellent support. He also said that Hayes products in the US have a high second-hand value. However, I wouldn’t be surprised to see the price reduced before too long, as the modem market is in a state of flux at present with new manufacturers coming into it and prices dropping significantly. (The US Smartmodem 1200 is available in the States, from mail order companies, for around $400, and the Smartmodem 2400 for about $600.)

The documentation is clearly presented and is of a high standard. There is also a comprehensive index.

Smartcom II

Smartcom II is a ‘smart’ terminal package designed to be used with Hayes’ intelligent modems. The UK version of the US product has all the usual smart terminal features, such as up- and downloading of files, including XModem protocols. Details of up to 25 systems can be stored, and these can be dialled and logged-onto automatically. DEC VT52/100 terminal emulation is also provided, but unfortunately Viewdata is not.

The program is menu-driven, which makes it very straightforward to use. However, as with any system, the menus can become a bit of a burden once you get used to the way the program works, especially as frequent disk accesses are involved.

When you run the program, the main menu is displayed at the top of the screen. One of the entries is highlighted, and underneath is a brief description of what it does. There is a comprehensive built-in ‘Help’ function available by pressing the F2 help key.

The program must be configured to suit your particular system before you are ready to go. Options include the serial port the modem is attached to, a parallel or serial printer, the number of disk drives, and various modem parameters such as tone or pulse dialling, loudspeaker volume and time to wait for a carrier to be detected. You must also edit the parameter sets to suit the systems you want to call – there are 26 in all, one of which cannot be altered; many of the other 25 are already defined when you buy the program. These definitions include a number of systems such as Telecom Gold, Telecom Gold via PSS, Dialog and Nexis. If any of these suit you, all you have to do is edit the phone number and enter your account and password (which can be hidden from prying eyes) in the auto-log-on section.

Going online is simply a matter of selecting T for ‘Begin communication’, followed by ‘O’ for Originate and the letter for the system you wish to call. The software then tells the modem to dial the number. When it is connected it will automatically log you on – marvellous if, like me, you are a bit ham-fisted, or can never remember the PSS code for the system you want to call. I have to admit that, before I started using them, I used to think that auto-dial modems were gilding the lily, but now I am converted. The cursor keys can also be used to step through the menus; the current option is highlighted, and can be selected by pressing ‘Return’.

The auto-log-on feature is just one of 26 macros that can be stored for each of the 25 definable systems. The macro features are very powerful. Up to 48 characters can be transmitted and you can choose the prompt character, the time to wait before assuming the prompt has got lost on the way, and whether or not a carriage return is required after the data. A number of lines of data can be used for each macro.
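The macro parameters described above amount to a small ‘expect/send’ exchange. The sketch below (in Python, purely illustrative; the function name and the character-by-character timeout are simplifications of my own, not Smartcom internals) shows the idea of waiting for a prompt character and then transmitting the data, with an optional carriage return:

```python
# Illustrative 'expect/send' sketch of what a macro does: wait for a
# chosen prompt character (giving up after a while, in case the
# prompt gets lost on the way), then transmit up to 48 characters,
# optionally followed by a carriage return.

def run_macro(incoming, prompt, data, send_cr=True, max_wait=10):
    """'incoming' simulates characters arriving from the remote
    system; waiting is counted per character for simplicity."""
    assert len(data) <= 48, "a macro transmits at most 48 characters"
    for waited, ch in enumerate(incoming):
        if waited >= max_wait:
            return None                      # prompt never arrived
        if ch == prompt:
            return data + ("\r" if send_cr else "")
    return None

# A log-on step: wait for the ':' of 'Password:' then send the password.
print(run_macro("Password:", ":", "SECRET"))               # 'SECRET\r'
print(run_macro("garbage with no prompt", ":", "SECRET"))  # None
```

Chaining several such steps, each with its own prompt, timeout and data, gives exactly the multi-line auto-log-on behaviour described above.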

Once connected to a remote system, all the usual features are available. Prepared messages can be uploaded continuously or line by line. XModem (or Hayes’ own) error checking protocols are available for file transfer. The printer can be toggled on and off by pressing a function key, so can spooling of data to disk. The status line at the bottom tells you how much disk space is left, the state of the printer buffer, the name of the system that you are connected to, and whether the Caps & Num Lock keys are active – a nice touch if, unlike the Philips PC I was using, you don’t have LEDs on the keys.

Another very useful feature is the ability to scroll back through information which has disappeared off the top of the screen. The manual says that the amount of incoming data that can be viewed in this way depends on the amount of memory in your computer. I had 256K and on one test counted over 60 screens full, without filling the data buffer. This is a feature that should be much more widely available on all computer systems, not just on a terminal program like this. Why should data be lost just because it has scrolled off the top of the screen?

As well as macros, you can set up ‘Batch’ files. Each can store up to 500 keystrokes, and if this isn’t enough, they can be chained together. A batch can be set to run at a predetermined time, which is potentially a very powerful feature. You could set up a batch to call a system, log-on, read all your mail into a disk file and log-off again, completely unattended. Unfortunately, batches seem to be a late addition to Smartcom II, and they haven’t received the same care and attention to detail as the rest of the program. The way they are created is by actually going online and recording your keystrokes as you read your mail, for example, so any typing mistakes you make are recorded in the batch. Unlike the parameter sets for systems, batches can’t be edited or copied. Also, there is little provision for dealing with errors induced by line noise. This means that, unless you are sure of a good line, it isn’t advisable to rely on them working unattended. However, they are a very efficient way of reducing connect time when calling systems normally.

Smartcom II also supports remote access using the auto-answer facility of the modem. You can use it from a remote system (also running Smartcom) as if you were sitting at the keyboard of the host machine; remote access enables file transfer, viewing of disk directories, and viewing and erasing of files. A password can be set to prevent unauthorised access. Smartcom II is designed to work with Hayes’, or Hayes-compatible, modems. The problem here is that Hayes currently only sells a V22 (1200 bits/sec full duplex) modem in the UK. The software – in its autodial mode – only supports 600, 1200 and 2400 bits/sec. It can be used with 1200/75 systems, with a V23 modem that buffers the 75 bits/sec line to 1200 bits/sec. It needs to be Hayes-compatible, of course, and I had mixed results here. Smartcom recognised a Miracle Technology WS3000, but for some reason the modem wouldn’t respond to the auto-dial commands. A Steebek Quattro wasn’t even recognised; I just got ‘Smartmodem not responding on COM1’. On paper, the Quattro seems to be virtually 100 per cent compatible with the Hayes Smartmodem 2400 (not yet available over here). I didn’t have time to pursue these problems in detail, and it may be that they could be overcome. However, it is obvious that you should try before you buy if you want to make full use of a ‘compatible’ modem.

In fact, Smartcom II can be also used with non-Hayes’ and ‘dumb’ modems by choosing the ‘direct connect’ port option. This still enables you to use all the features of the program, apart from auto-dial. It is also possible to use a non-Hayes’, auto-dial modem by setting up one of the macros to issue the dial command. You could even set up a batch file to take you through the log-on process, which would avoid the need for Hayes compatibility.

Nothing is perfect, but Smartcom II has very few things missing. The most obvious omissions are Viewdata and full 300 bits/sec support. Viewdata is one area where Hayes does seem to have misjudged the UK market and it is likely to be added in a future release. 300 bits/sec support for Hayes-compatibles is another matter: I doubt that it will appear unless Hayes introduces a 300-baud modem of its own. There is also no provision for translation tables (to filter or amend the data streams) but control codes can be filtered out. The only thing that ‘niggled’ me was the fact that, although the modem detects the absence of the dial tone when trying to dial, the software doesn’t recognise this feature.

Smartcom II is a very nice package, carefully designed and implemented. The documentation is, for the most part, excellent, with a comprehensive index. Hayes maintains a help line which puts you straight through to people who know enough about the company’s products to be able to answer most questions easily; Hayes in the US has a good reputation for support. Smartcom II runs on the IBM PC and compatibles. Apparently they do have to be pretty compatible – it worked fine on the Philips P3100.

The recommended retail price is £140 (excl VAT), which is reasonably competitive (but I’ve seen it advertised in the US for $70! I don’t know how different the British version is from that one). It comes with vouchers for Telecom Gold, Nexis, Dialog and Knowledge Index.

First published in Personal Computer World magazine, May 1986

Microsoft Excel

Microsoft Excel is a powerful, sophisticated spreadsheet which runs under Windows and has the potential to overtake Lotus 1-2-3 in the popularity stakes. But how do its features compare with those of its established Macintosh relative? Anthony Meier finds out.

Microsoft’s new spreadsheet program, Excel, looks set to leave Lotus 1-2-3 and its lookalikes well behind in the spreadsheet stakes. It promises to be the most powerful and user-friendly spreadsheet written to date. It is being introduced as the third generation spreadsheet for personal computers, and is designed primarily to run on machines based on the 80286 and 80386 microprocessors. Macintosh users will be familiar with this program already, as a version of Excel has been available for this machine for 24 months or so (see the ‘Function comparison’ box for a comparison between the two versions).


Excel is a sophisticated piece of software which offers many advanced spreadsheet facilities and programming features, an integrated onsheet database and a wide range of charting and graphing facilities. It is the first spreadsheet in the MS-DOS environment to offer interactive, dynamic linking of worksheets, a one-step automatic macro recorder and high-resolution output. It runs under Microsoft’s Windows 2.0 and takes full advantage of all its facilities, providing multiple worksheets in overlapping windows onscreen, pull-down menus and full mouse operations.


Excel makes use of the ability of Windows 2 to have a number of spreadsheets open at once. The arrows at the top left size the windows. The most obvious difference from Excel on the Apple Macintosh is the use of colour.

A run-time version of Windows 2.0 is bundled with the program for users without the full version. A version of Excel is also planned for the OS/2 operating system. Windows 2.0 has an identical interface to that of OS/2 with Presentation Manager, so Excel users should find making the transition to that new operating system easy.

The machine I used for the review was a Dell 286 with a 20Mbyte hard disk, EGA card, colour monitor and mouse. I also had an AST card installed which increased the memory from 640K to 2.5Mbytes to give more room for testing large spreadsheets. There is only about 140K available for data on a standard 640K machine.

Installing the program on the hard disk was very simple. It involved inserting the setup disk, typing ‘setup’, and following the instructions given on the screen. These asked for the other disks supplied to be inserted one by one until all the necessary files had been copied across. I was supplied with 14 disks: eight contained the files for Excel, and the other six the files for run-time Windows.

In use

When the program had been installed and loaded, I found Excel very simple to learn and use. Virtually all of the program’s operations can be carried out with the mouse, using combinations of clicking, double-clicking and dragging.

The mouse can be used to give all the commands and instructions you need in Excel. It saves you from having to learn and type in commands at the keyboard, and makes program operation very fast. You can also keep your eyes on the screen instead of continually glancing at the keyboard. However, keyboard lovers can still use the keyboard instead of the mouse for all the commands and operations they need – even moving and sizing windows. Pressing the ‘Alt’ key makes the menu bar active, then pressing the underlined letter of the menu title you want (or using the cursor key and Return) pulls down that menu. Finally, pressing the underlined letter of the command you want (or using the cursor key and Return) invokes that command. Pressing the ‘Esc’ key cancels the menu selection.

The mouse, however, does make it quick to select a cell, or cells, for data entry – you just move the pointer to the cell you want and click to make it active. You then need to use the keyboard to type your data in. The mouse can also make operations like inserting and deleting rows and columns, and cutting, copying and pasting cells, very fast.

The mouse also comes in handy for entering cell references into formulae. Instead of typing in a cell reference, you only need to point and click on the cell in question for its reference to be automatically inserted into the formula. Dragging the pointer across a range of cells inserts that range into the formula. And you can include references to cells on another spreadsheet (linking the spreadsheets) just by clicking on the cells in that other spreadsheet. This saves time setting up formulae and speeds up the creation of models.

Spreadsheet handling with Excel is very impressive. You can have several spreadsheets, charts and macro sheets onscreen at the same time, each one in its own window, like so many pieces of paper. You can shrink or expand the windows, depending on which one you are working on, and you can transfer information easily from one to the other.

Spreadsheets can easily be linked, allowing you to consolidate figures from as many different spreadsheets as desired. Because you can work on many spreadsheets at once, you can see the effects of changes in one worksheet on other linked worksheets immediately on the screen.

Each spreadsheet has a maximum of 16,384 rows by 256 columns, and it is easy to move quickly to any desired location using the mouse on the scroll bars along the sides of each window. Column widths and individual row heights can be adjusted easily with the mouse. Each window has a horizontal and a vertical split bar which you can use to divide the window into a maximum of four panes, to see different parts of a spreadsheet next to each other. You can also open up new windows for the same spreadsheet if this is more convenient. As you are expected to have many windows fighting for space on your screen, there is a window menu which lets you select the window you want to bring to the top of the others.


Excel has all the features and functions you would expect to find in a top spreadsheet package, such as cell protection, calculation options and zero suppression. It has an ‘undo’ feature that can reverse your last command if you make a mistake, and it also has a matching ‘repeat’ feature that you can use to repeat your last command.

Excel only recalculates those cells that have changed since the last calculation, thus speeding calculation. It also uses ‘background’ calculation, so you can continue working while it recalculates rather than waiting for every cell to be updated, which is nice.
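The principle of recalculating only the affected cells can be sketched as a simple dependency walk. This is not Microsoft's published algorithm – merely an illustration of the idea, with an invented pair of formula cells and a hard-coded dependency order:

```python
# A minimal illustration of recalculating only changed cells: each formula
# cell declares its precedents, and only cells depending (directly or
# indirectly) on the changed cell are recomputed.
values = {"A1": 2, "A2": 3, "B1": 0, "C1": 0}
formulas = {"B1": lambda v: v["A1"] + v["A2"],   # B1 = A1 + A2
            "C1": lambda v: v["B1"] * 10}        # C1 = B1 * 10
precedents = {"B1": {"A1", "A2"}, "C1": {"B1"}}

def recalc(changed):
    # Mark every cell that transitively depends on the changed cell.
    dirty = {changed}
    grew = True
    while grew:
        grew = False
        for cell, pres in precedents.items():
            if cell not in dirty and pres & dirty:
                dirty.add(cell)
                grew = True
    # Recompute the dirty formula cells in dependency order (hard-coded here).
    recomputed = [c for c in ("B1", "C1") if c in dirty]
    for cell in recomputed:
        values[cell] = formulas[cell](values)
    return recomputed

values["A1"] = 5
print(recalc("A1"))   # ['B1', 'C1'] – only the dependent cells
print(values["C1"])   # 80
```

A real spreadsheet maintains this dependency graph automatically as formulae are entered; the hand-built dictionaries above simply stand in for that machinery.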

Excel has more functions than either Lotus 1-2-3 version 2 or Excel for the Macintosh. The box on page 140 gives a comparison, although functions alone should not be used as a guide to a program’s overall capabilities.

Many of Excel’s functions are similar to those of Lotus 1-2-3, so 1-2-3 users should be able to build spreadsheet models with Excel’s functions without too many problems. Some of the interesting new functions provided by Excel are as follows:

  • The ‘information’ function, CELL(type-of-info, reference), returns information about the formatting, location or contents of the upper left cell in ‘reference’. CELL(“width”, F13), for example, would give you the column width of cell F13. CELL(“format”, B12) would give you information on the cell formatting.
  • The text function, CODE(text), returns the numeric ASCII code of the first character in ‘text’. CODE(“Alphabet”), for instance, would equal 65. CODE(B5) would equal 70, where cell B5 contained the text “February”.
  • Excel can be used for working on arrays, which are groups of two or more values that can be used like a single value in formulae and functions. Excel also has matrix functions which can be used for working with these arrays. The matrix function, MMULT(array1,array2), returns the product of two arrays, where both arrays contain only numbers. This might be written as MMULT(A1:B2,D1:E2).
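To give a flavour of what two of these functions compute, here is a rough sketch in Python of the CODE and MMULT behaviour described above (the names and semantics follow the article’s examples; the Python itself is purely illustrative and is nothing Excel contains):

```python
# Rough equivalents of two of the Excel functions described above.

# CODE(text): the numeric ASCII code of the first character of 'text'.
def code(text):
    return ord(text[0])

# MMULT(array1, array2): the matrix product of two numeric arrays.
def mmult(a, b):
    inner, cols = len(b), len(b[0])
    return [[sum(row[k] * b[k][j] for k in range(inner))
             for j in range(cols)] for row in a]

print(code("Alphabet"))   # 65, as in the example above
print(code("February"))   # 70
print(mmult([[1, 2], [3, 4]], [[5, 6], [7, 8]]))   # [[19, 22], [43, 50]]
```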

Compatibility with Lotus 1-2-3

Many of Excel’s new users are expected to be previous Lotus 1-2-3 users, and Microsoft has developed tools and functions within Excel to make learning and using the program easier for these users. The features will also help Excel integrate more easily into a Lotus 1-2-3 environment.

To start with, two-way file compatibility enables spreadsheets to be exchanged between the two programs. Then there is a useful 1-2-3 macro translator that can automatically convert nearly all 1-2-3 macros into Excel macros. A ‘1-2-3 Help’ facility lets users type in the command sequences they would have used in 1-2-3 and automatically gives them the corresponding Excel commands.

Presentation features


Fonts, type styles and colour can be used to enhance the appearance and logic of a spreadsheet both onscreen and when printed. Debits, for example, could appear in red

Excel’s presentation facilities are very impressive, and provide you with a wide range of screen display and printing options. You can turn the spreadsheet grid on or off, show or hide the row and column headings, switch them between R1C1 and A1 according to your personal preference, and choose between different font types and sizes. You can use up to four different fonts on one worksheet – individual row heights will automatically adjust to accommodate the font sizes you choose. There are 19 number-formatting options covering things like date formats, decimal places, commas and negative brackets.


An Excel worksheet can be as plain or as detailed as you want, with grid lines and headers being optionally shown in various colours and styles.

Individual cells can be emboldened, underlined or italicised. You can add shading, create boxes or lines around cells or blocks of cells, and control screen colours to enhance the appearance of the screen display or printed document. You could have all the positive figures in a column display automatically as blue, and all negative figures red, for instance. All these facilities help you to produce printed documents that rival word processor output and can be used for final reports and presentations.

There is a page preview facility to let you see a miniature version of your page as it will look when printed out, which is very useful for checking pages before printing them. It is also useful for viewing large spreadsheet models like a map to give you a better idea of what they look like.


A wide variety of printers and plotters are supported, and your own printer and plotter drivers can be installed during the ‘SETUP’ procedure. High-resolution graphics printers are required if you want to take advantage of the graphics output of the program – a laser printer would be ideal.

Excel includes a sophisticated printer spooler that lets you queue up print jobs, control the printing operation and continue with your work while they print in the background.



Charts are created by selecting an area of data and then choosing a chart style option. Charts are automatically updated as the data changes.

Excel has sophisticated charting and graphing facilities. A wide range of charts can be summoned instantly from selected spreadsheet cells and will change shape automatically if the cell contents are changed. You can see a chart in one window change as the data in the spreadsheet window alongside it is altered.

To create a chart from data in your spreadsheet, you first need to select the data you want to chart. This can be done by dragging the mouse across the relevant cells to highlight them, then you select the ‘File New’ command and click on the ‘Chart’ option. This creates a new chart window that automatically contains a default-type chart built up from the values in your highlighted cells.

The program has 44 pre-designed chart formats grouped into seven types of charts: area, bar, column, line, pie, scatter and combination. When any of these is created, the program provides default labels and designs. The charts are highly customisable, however, and most of the parameters can be altered to suit your own requirements. You can alter the colours, add text labels and legends, and scale the chart horizontally or vertically to get it to look just the way you want.


Many chart styles are available. The ‘help’ system includes a cross-reference to Multiplan and 1-2-3 commands, so users who know what to do in those programs can transfer across.

Auditing & documenting

Excel has very useful auditing and documenting features. These help you check the logic and formulae in your model, track down errors and discrepancies, and document your model for your own reference and for other users. You can attach notes to any cell and view them using the ‘Show Window Info’ command. This command also shows you other information such as the cells that contain references to your active cell (dependents) and the cells that it refers to (precedents).

You can use the ‘Formula Select Special’ command to highlight all the dependents and precedents in the worksheet for easy identification. You can also automatically find all the cells with notes or those containing a particular formula.

These features are a great help when you are creating or amending a spreadsheet model and when you are checking its logic. They reduce the risk of missing important cells and making errors.

Excel has sophisticated cell-naming features, too. You can name each cell in a block of cells automatically by using a combination of the titles in your row and column headings. You can easily find cell references in a spreadsheet and replace them with names, and you can search for the names themselves.

You can define a name which is not attached to a particular cell, but which refers to a value: ‘INFLATION’, for instance, can be defined to be ‘4%’. Then, whenever you use the name in formulae in your spreadsheet (and in other spreadsheets) it will equal 4%.
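The effect of such a defined name can be mimicked in a few lines of Python (a sketch only – the ‘INFLATION’ name is the article’s example, and the compound-growth formula is invented for illustration):

```python
# A defined name that stands for a value rather than a cell: any formula
# using the name picks up the value, as with 'INFLATION' above.
names = {"INFLATION": 0.04}

def grow(amount, years):
    # Compound 'amount' by the named inflation rate for 'years' years.
    return amount * (1 + names["INFLATION"]) ** years

print(round(grow(1000, 2), 2))   # 1081.6
```

Changing the single entry in `names` changes every formula that uses it, which is exactly the attraction of a named value in a spreadsheet.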



Macros can be created line by line or recorded; this allows Excel to ‘learn’ a process that the user performs. A separate module allows for the conversion of 1-2-3 macros.

Excel has powerful macro facilities which let you pre-program the system to perform calculations and operations automatically. Excel macros have their own programming language and are created on separate macro sheets which are handled in the same way as spreadsheets. The macro commands are typed into cells in a column and, like cells on a spreadsheet, can be deleted, copied and moved around. You can have as many macro sheets as you want, and as many different macros as you can fit on each macro sheet. The macros can then be used with any spreadsheet.

You can incorporate branches and loops into your macro, and control can pass from one macro to another if certain conditions are satisfied. You can create ‘intelligent’ macros which interact with the user: for example, prompting for information at certain stages using dialogue boxes.

There is also a group of macro commands for customising the appearance of the program itself. You can set up your own menu bar and menu options, and create your own commands and dialogue boxes. You can use these facilities to effectively create your own custom applications within Excel.

The automatic macro creation facility can be used to build macros if you want to avoid programming – this works by simply recording actions you perform. The ‘Record’ command starts the macro recording, after which you can perform the task you want to record. When you have finished, you give the ‘Stop Recorder’ command. Once recorded, the macro can be edited and added to just like any other macro. In fact, if you place the macro sheet window next to your worksheet window, you can watch the macro being created line by line as you perform the actions it records.

Macros can also be used for creating new spreadsheet functions; these are called function macros as opposed to the command macros just described. The 131 functions already available cover most of the standard purposes I can envisage, but function macros can be created for more complex, customised requirements. A function macro called ‘PAYE’, for example, could be set up to calculate the tax due for a given set of variables such as gross pay, tax code, month, and so on. Function macros can be used in formulae in the same way as standard functions.
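As a sketch of what such a ‘PAYE’ function macro might compute – the tax rules below are entirely invented for illustration, since the article gives none – consider:

```python
# Hypothetical 'PAYE' calculation: a flat basic rate applied to pay above
# a tax-free allowance. The rate and figures are invented for illustration.
def paye(gross_monthly, free_pay, basic_rate=0.27):
    taxable = max(gross_monthly - free_pay, 0)
    return round(taxable * basic_rate, 2)

print(paye(1500.0, 400.0))   # 297.0
```

In Excel the equivalent function macro would then be usable in a cell formula exactly like a built-in function, with the gross pay and allowance supplied as arguments.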



It is possible to create forms for the entry of information into a database section of an Excel worksheet. There is provision for creating search criteria for finding records.

Excel has on-sheet integrated database facilities with 11 database functions and a new feature, an automatic database form interface. Any rectangular area of the spreadsheet can be designated as the database area, after which its rows become database records and its columns database fields. All the database functions, like ‘EXTRACT’, ‘DSUM’ and ‘DMAX’, are then available for acting on the information, but these don’t interfere with other spreadsheet functions which can be used as normal.
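The behaviour of a DSUM-style function over such a rectangular database area can be sketched like this (the field names and figures are invented; the rows play the part of records and the columns of fields):

```python
fields = ["Name", "Region", "Sales"]
records = [
    ["Ames",  "North", 120],
    ["Brown", "South", 200],
    ["Clark", "North",  80],
]

# DSUM-style aggregate: sum 'field' over the records matching a criterion.
def dsum(field, match_field, match_value):
    f = fields.index(field)
    m = fields.index(match_field)
    return sum(r[f] for r in records if r[m] == match_value)

print(dsum("Sales", "Region", "North"))   # 200 (120 + 80)
```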

The ‘Database Form’ command is used to bring up the automatically created form window, which you can use to enter, edit, delete and find records. The form resembles the standard form layout screen that many database programs provide, and makes using the database very simple.

The macro facilities can be used in conjunction with the database facilities to perform customised database operations and create customised database applications.

How Microsoft Excel compares to the Macintosh version

On running the Windows version of Excel for the first time, I was amazed at its similarity to the Macintosh version. The look of the spreadsheet with its cell grid, the cross pointer, the menu options and the way in which the mouse operated are all the same. The ways in which you create macros, databases and charts are the same, too. On closer inspection there are a few differences, all of them turning out to be improvements. The Windows version I used did not seem to be as fast as the Macintosh version, but the final release should be faster once all the debugging code has been removed.

The Windows version has all the features of the Mac version with many more besides. The first new feature I noticed was a status line at the bottom of the screen that gives brief explanations of each command as you move through the menu options – very helpful for the first-time user. Another is that you can choose between short and full menu options: short gives you the most commonly-used commands and may be more suitable for beginners; full gives you the complete range of commands.

On the Mac version you can adjust only column widths on a spreadsheet, but on the Windows version you can adjust the row height of individual rows as well. You can also use more than one font on a worksheet. Both these features give you a lot more flexibility in designing models and spreadsheet reports.

On the Windows version, there is a new ‘Arrange Windows’ command that automatically resizes and fits all your windows into neat boxes on the screen to let you see them all side by side. I found this feature very useful when my screen became cluttered with several spreadsheet windows.

The ‘Resume Excel’ feature from the Macintosh version has been enhanced in the form of the Workspace feature on the Windows version. This lets you save all open worksheets and window arrangements you are working on for any particular project as a workspace file, to which you can give a name. You can then reload that workspace file (or any other) if you wish to continue working on that project, and all your worksheets and windows will be opened up exactly as they were when you saved them.

The auditing and documentation features of the Windows version, described in the main text, are an important new addition that make the Windows version useful and practical, and there are also many new spreadsheet functions (see the ‘Function comparison’ box).

There are other differences too, but for day-to-day operations the programs are basically the same; and a Macintosh Excel user should have no problem at all getting to grips with complex spreadsheets on Excel for Windows. However, the Windows version offers more features and functionality which power users will find very useful indeed.

Data transfer

Data transfer facilities are very important, as you may often need to import data from other programs to Excel in order to perform analysis and create reports from it. Excel can read and write files in any of the following formats: text, CSV (comma separated values), SYLK, WKS and WK1 (Lotus 1-2-3), DIF, DBF2 and DBF3 (dBase II and III). This is a comprehensive range and facilitates the exchange of data with a wide variety of programs.
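Of those formats, CSV is the simplest and the most widely supported; a quick round trip shows the sort of plain-text representation involved (the cell values here are invented):

```python
import csv, io

# Write a small block of cells out as CSV and read it back unchanged.
rows = [["Month", "Sales"], ["Jan", "100"], ["Feb", "120"]]
buf = io.StringIO()
csv.writer(buf).writerows(rows)
buf.seek(0)
print(list(csv.reader(buf)) == rows)   # True
```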

The Dynamic Data Exchange (DDE) protocol resident in Windows can also be used by Excel to exchange data with other programs running under Windows.


The Excel manuals are well up to Microsoft’s usual standard, and I didn’t have to refer to them too often since the program’s menu options are fairly self-explanatory.


Excel is an impressive program, and there is no reason why it should not ultimately overtake Lotus 1-2-3. It has superior power and ease of use, more facilities, and it is easy for 1-2-3 users to upgrade to. I have been a regular user of Excel on the Macintosh for some time, and I am confident that Excel for Windows will serve me equally well.

Function comparison

Function type       Lotus 1-2-3   Excel (Windows)   Excel (Macintosh)
Maths/Trig          17            26                18
Logical/Special     18            34                23
Text/String         18            21                8
Date & Time         11            12                10
Financial           11            13                8
Statistical         7             14                11
Database            7             11                7
Total               89            131               85

Anthony Meier is a chartered accountant and computer consultant.

First published in Personal Computer World magazine, December 1987

Hard Disk Cards

Peter Jackson examines three hard disk cards for the IBM PC and compatibles – the Hardcard, the FileCard and the Dinasti, which are all examples of an obvious but long-awaited idea: the provision of hard disk capabilities in machines with floppies only.


The Hardcard, Filecard and Dinasti can all be plugged neatly into the internal bus of your PC or PC-compatible to give you extra hard disk storage (Photography by Philip Gatward)

When Plus Development showed off its Hardcard plug-in disk board for the IBM PC last summer, it suddenly seemed such an obvious idea that the only wonder was how long it had taken for someone to build it. The pieces of the puzzle had been there all along; Winchester disk controller cards could already fit into half-length IBM slots, while half-height and third-height 3.5in Winchester drives were already being built into various machines. All that was needed was for someone to notice that a 3.5in Winchester was smaller in diameter than the standard 4in height of an IBM card, and the rest was merely engineering.

A hard disk controller card could be built using all the latest single chip techniques, including custom integrated circuits and surface mounting technology, to squash the controller down to one end of a standard 13in IBM plug-in board. That would leave room to bolt a metal-cased, shock-mounted, half-height Winchester, like those being used in the new breed of portables, to the other end of the board. Plus Development got there first, and launched its 10Mbyte Hardcard in July last year.

In science there is something called the Eureka effect, where everyone in the field suddenly recognises an idea that has simplicity, elegance, and an indisputable air of correctness. Or, to put it another way, one instinctively knows when something is right. Hard disk makers were certainly shouting ‘Eureka!’ as they quickly announced instant competition for Plus’ Hardcard, with higher capacities than Plus’ 10Mbytes, and various add-on functions such as additional RAM, I/O ports, and real-time clocks.

There were around five such boards on view at the Comdex trade show in the US last November, and there are now something like a dozen on the market from all the expected hard disk names and some unexpected new names, too. It seems like a good time, as these drives start to appear in the UK, to examine three typical examples of the species: Plus Development’s original Hardcard, Western Digital’s FileCard and JVC’s catchily-named 3.5in Hard Disk Subsystem model JD-S3812MOSO.

Where they come from

Plus Development itself is not such a new name as it seemed last summer. Its parent company is California-based Quantum, a well-known designer and manufacturer of big, hard disk drives and controllers for the OEM market – that is, other companies buy bare drives and controllers from Quantum and build them into their own systems. Quantum could see opportunities in the micro market for retail storage products, and set up Plus Development to explore that business without making its big OEM customers nervous.

The work on the Hardcard design took over a year and a half, in conjunction with a manufacturing subsidiary of the giant Japanese corporation Matsushita, which actually puts the boards together.

In the UK, Plus has signed an exclusive distribution deal with Computer Marketing Associates, best known here for selling the IRMA communications board range from DCA in the US.

Western Digital, based in Irvine, California, has a similar but different background. The company has been designing disk controller boards and chips for a decade and more, first for minicomputers, then for early S-100 bus micros, and then for PC-style single-board computers. Its single-chip Winchester controllers are used in disk add-ons and add-ins from numerous other manufacturers, and perhaps it was this that prompted the company – like Intel before it – to set up an Enhanced Peripherals Division to market finished products to the retail market. Prime place in the list of three new launches last November went to the FileCard, a 10Mbyte drive on a single IBM card with a compact controller built around Western Digital’s own custom controller chips.

Western Digital’s Enhanced Peripherals division has announced its own distribution operation in the UK, which was launched in February this year.

JVC’s JD-S3812MOSO comes from one of Japan’s best-known names in just about every area of electronics. The Japanese Victor Company makes hi-fi, video recorders, compact disc players, TV sets, MSX computers, and now hard disk add-ons, among its other activities. The pedigree of the JVC board is not known, but in comparison with its US competition, the packaging and the manuals have the air of being put together at speed.

It was, however, one of the first boards to be available in the UK, through distributor Suffix, and Suffix has changed the name for obvious reasons. The reasons for the final choice of name – Dinasti – are not so obvious.

Best of the rest

In the US, other boards available include the 20Mbyte DiskCard from top hard-disk maker Tandon, and the 20Mbyte DriveCard from Mountain Computer. In the UK, Plus 5 Engineering has launched the PlusCard and XTech has started shipping its Insider. And just to prove that the market is beginning to reach some kind of maturity, it got its first lawsuit at the end of February this year. Quantum, parent company of originator Plus Development, is suing Mountain Computer and any other plug-in Winchester maker using drives from NEC in Japan. Quantum alleges that the NEC drives infringe its patents, although both Mountain and NEC are strenuously defending the case.

First a lawsuit, and now the first official price cuts in the face of competition are under way. The plug-in Winchester business is obviously in a healthy condition.


Plus Development’s Hardcard has all the benefits of being first in the market, but it has all the drawbacks, too. The board was designed to be simple to install, simple to use, and usable in every IBM PC on the market, including those where a hard disk was already installed.

By aiming at all PCs, including those benighted machines with the old ROMs, cassette port, 64k motherboards and weedy power supplies, Plus could sell to upwards of three million PC and XT owners. But supporting the old machines means that the power consumption had to be kept down, and that the board had to fit in any long slot at all, keeping the capacity of the board’s Winchester drive down to 10Mbytes for the foreseeable future.

Opening the smartly-designed packaging, done out in Mothercare-style pastel blue, pink and green, reveals the result. The board, weighing around 1kg (2.2lb), has a slim metal enclosure covering the 3.5in Winchester drive at the non-connector end, while the controller electronics are exposed above the bus connector. A glance at the electronics shows the work that has gone into keeping down power consumption and size; the board is tightly packed with big, surface-mounted, very-large-scale integration chips in their distinctive square packages, and most of the chip type numbers, including the custom logic circuits, contain the tell-tale ‘C’, showing that the chips use low-power-consumption CMOS technology. There are signs, though, that the controller hardware is still being developed; the board layout of the current model is certainly different from the original prototypes, and there is still an extraneous transistor soldered onto the back of the board to put right some small glitch.


The smallest, neatest and narrowest disk card here is the Hardcard

Apart from the board, the box contains only a plastic card guide (if required), along with one slender manual.

Installing the board is as simple as installing an IBM PC expansion board ever is. After sliding off the system unit cover and removing one of the metal slot panels at the rear, the board simply slides into the board guides and into the PC’s bus connector. The Plus manual flatly states that users should remove any existing board guide at the front of the machine and replace it with the one supplied with the board, but I had no problems with the guide that was already in place. Presumably, Plus wants to make sure that the board is firmly held in the guides to cut down any vibration when the drive is spinning, but the guide in my machine seemed tight enough. On the other hand, the manual’s instructions to screw the board firmly into the metal frame at the back of the machine are just good sense.

That is the entire hardware installation procedure, and was no more difficult than my experience of installing a 256k RAM board in the same machine. There is no need to change the DIP switch settings on the PC, and the only possible hardware change the user can make is to alter a ‘jumper’ on the Hardcard itself. The jumper is set at the factory, assuming that the Hardcard is going to be the first hard disk drive in the system, but if it is going to be the second hard drive – in an XT, for example – the jumper position needs changing. This is a simple task, too, since the jumper positions are straightforwardly labelled ‘PC’ and ‘XT’.

Powering up the re-assembled PC with the usual PC-DOS or MS-DOS system disk in floppy drive A brings up the usual prompt, and the real installation of the Hardcard, the software installation, can begin.

The drive on the board is already formatted at the factory, and also has an Install program ready-stored on it. Typing C:INSTALL C at the A> prompt starts the installation program running if the Hardcard is the only hard disk drive in the system, while if the jumper has been changed for an XT system, the Hardcard is drive D and the instruction is D:INSTALL D.

Either way, the Install program takes control of the system and goes step by step through a 10-minute installation procedure. This involves putting the system tracks and all the normal PC-DOS or MS-DOS utilities on the Hardcard. You also need a blank floppy during this procedure, as the Install program creates a ‘reinstall’ disk that can be used for setting up the Hardcard from scratch in case of error or glitch. Re-install deletes all the data files, though, so backing up these files should be done at the slightest provocation.

While the installation proceeds, you have the chance to notice one neat feature of the Hardcard’s ready-installed software. One problem with plug-in hard cards for PCs is that there is no front panel indicator to show when the drive is active, and such an indicator can be helpful: during a long program compilation, for example, it may be only the hard disk activity that shows the machine hasn’t seized up. Plus provides a program called ‘Light’ on the board’s drive, which flashes a plus sign (+) at the top right-hand corner of the screen whenever a read or write operation is under way, at system level or inside an application. A complementary program called ‘Sound’ beeps whenever a read or write operation takes place, which sounds appalling through the PC’s speaker. Either or both of these features can be turned on or off at any time.

At the end of installation, all you have to do is leave the door of the A floppy drive open and press the Ctrl-Alt-Del key combination for an ersatz reset. Just like the XT, the system will boot from the Hardcard if there is no disk in drive A.

At boot-up, the Hardcard runs an autoexec batch file to bring up the Hardcard Directory (HCD) program, also provided on the disk. This lets you set up a menu-driven front-end for all the applications installed on the Hardcard, and has 16 possible menu items. Each numbered item has its own MS-DOS sub-directory, and if you intend to use the HCD structure, you need to copy all the necessary files for each application into its own sub-directory. Then the HCD program lets you create a macro, or a miniature set of batch commands, called by selecting a particular menu item. For instance, if all the WordStar files are copied into sub-directory SUB1, associated with HCD menu item 1, then the macro you would enter would be CD \SUB1 <return> followed by WS <return>. Then the menu item name could be changed to WordStar, and after booting the Hardcard, the program could be run simply by selecting that item.

Quitting any application called up from the HCD menu returns the user to that menu for another selection, and using HCD is about as friendly as using a menu ever is, which is not much. The software does have one disconcerting habit, though: it turns off the screen display after five minutes without use. Seeing the screen go dark from the corner of an eye, while half-waiting for the board to blow – thanks to heat problems – can add 10 years to your age.

I had no problems with the drive overheating, or overstraining the power supply, despite installing the Hardcard deliberately between the hot-running video board and the 256k RAM board in the Kaypro system. However, like most clones, the Kaypro PC’s power supply is rated at XT levels to handle a hard disk rather than at the early PC levels for floppy use.

All that aside, there is nothing to show, when using the system, that the Winchester drive is on a card. The machine with the Hardcard acts just like an XT with twin floppies, as indeed it should if Plus has done its market research and hardware design properly.

The general impression is very favourable. The Hardcard really is simple to install and use, it really does go in any long PC slot – including one in the Compaq or the IBM portable – and it is unobtrusive. With the usual noisy PC fan you can’t hear the drive at all, and must rely on the flashing plus sign for confirmation that the thing is running.

Plus has kept its original idea simple, elegant, useful – and tough. An inadvertent shock test involved dropping the board onto a carpeted floor from three feet, with a metal system unit casing instantly falling onto it from the same height. That was before installation, so unless the board was broken before and violent treatment fixed it, the Hardcard survived the test unscathed.


Like Plus Development, Western Digital obviously called in a design house to do its packaging: this time, the colours are white, grey and light green. Inside, the contents are similar, too, comprising the FileCard board itself and two slim manuals.

The board is obviously similar in structure to the Hardcard, with the metal drive case at one end and the controller electronics and bus connection at the other. With FileCard, though, there are differences. The drive casing is thicker than Hardcard’s, with its own mounting bracket at the end instead of a simple spline to slide into a card guide. The controller electronics are different, too, although they too make full use of VLSI, CMOS, custom circuits and surface-mounting technology. The controller, as you might expect from specialist Western Digital, is beautifully laid out and built, with no blatant kludges or late bolt-ons. It looks more complex than the controller on the Hardcard, and there is one obvious reason for that in the shape of an extra connector at the bottom of the card. This is meant to take a piggyback expansion board, and Western Digital has so far produced one containing up to 512k of add-on memory. Others are meant to be on the way, and would make it possible to combine a hard disk and a multifunction board in one card slot.

Installing the FileCard follows the same general lines as before, but Western Digital imposes some limitations on where the card can go. It can’t go in slot one, where the speaker is, since the speaker would get in the way of the drive casing. And although a selling point is that the board takes up just one slot, that is true only in the basic PC (and even then, not in slot one). In the XT, with its greater number of slots slightly closer together, the FileCard takes up a slot and a half. This means that the drive casing overhangs the space that a next-door long card would need, and only a short card such as the new Hercules colour card or a RAM card could be used in that slot.

The Filecard’s metal mounting bracket at the drive end of the board is designed specifically for the IBM PC, where the hole in the bracket matches up with a hole in the front panel, and the screw provided clamps the drive casing to the machine casing. On the Kaypro the bracket forces the removal of two card guides to install the board, and then there is no matching screw hole. A less-than-conspicuous appendix to the manual explains how to install the board using the card guides typically found in PC clones, but since the drive seemed to be working without guides or screw fixing, and was jammed in pretty tight due to the crowded interior of the machine, it seemed easier to leave it.

One disturbing feature of the FileCard installation, apart from the usual dire warnings about CMOS circuits and human static electricity, was the fact that the drive’s actuator and motor spindle were exposed through a hole in the back of the board – just where your fingers go when you are trying to wrestle the bulky board into its tight slot. There is a warning about this in the manual, stating that fingers should be kept out of the holes, but without giving any reasons for the holes to be there.

However, once again, that’s all there is to the hardware installation, as long as the piggyback memory board is not fitted to the FileCard. Installing that board just involves plugging it onto the socket provided and screwing it down to the main disk controller portion, and it does not make the board any thicker. The expansion board was not available for this review.

The software installation followed the same route as for the Hardcard, but was not trouble-free. Installing the DOS floppy in drive A and booting as usual, then running the Install program provided on the FileCard by typing C:INSTALL, started the procedure in conventional style. The program copied over the DOS files and utilities needed on the FileCard’s disk, and then, in a minimalist style typical of the software, announced that it needed a blank disk in drive A, warned that all the information on that disk would be destroyed, and asked whether I wanted to continue. I typed Y for yes in answer to the last question, expecting a warning to remove the current disk and insert a blank. Instead, the program started to format my DOS disk.

Having produced another DOS back-up – for once I wasn’t using the master copy – I tried again. This time all went well, until the machine hung up with the message: ‘Installing DOS partition on the FileCard’. This was solved by rebooting the system with the re-install floppy in drive A; this re-formatted the hard disk and seemed to produce a FileCard working as it should.

Perhaps this was a compatibility problem, but it seems unlikely as all the utilities needed for the installation are generic MS-DOS types rather than pure IBM PC-DOS types. Whatever the reason, the experience did not inspire confidence, although it seemed to end well.

As with the Plus Hardcard, the installation assumes that the FileCard is the first hard disk in the system unless it is told otherwise by the appropriate jumper changes on the FileCard controller. The jumpers also allow the user to select settings for a single FileCard, for one internal hard disk plus one FileCard, or for two FileCards and no internal hard disk.

Instead of HCD, the program specially produced for Plus Development, Western Digital provides a free copy of the commercial XTree program on the board’s disk. XTree, written by Executive Systems of Sherman Oaks in California, is a file-organising front-end for MS-DOS which keeps track of the chaos of files, directories, directories of directories, subdirectories, and so on, ad infinitum, created by using MS-DOS.

In its series of onscreen windows, XTree shows the structure of directories and their sub-directories graphically, as a ThinkTank-like indented tree structure. The root directory is at the top of the graphic window, with the others sorted alphabetically and extending down as far as necessary. Instead of changing directories with the CD <name> command in MS-DOS, in XTree it is done by moving the cursor to the appropriate directory name in the tree picture. The files in that directory then automatically appear in the Files window below the tree picture, and a program can be executed simply by placing the cursor on the program filename and typing X for execute.
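To picture the layout, the tree window looks roughly like this (the directory names below are invented for illustration; the real display depends, of course, on what is on your own disk):

```
C:\              <- root directory at the top of the window
  DOS
  LETTERS           sub-directories sorted alphabetically,
    PERSONAL        indented one level for each step
    WORK            down the tree
  XTREE
```

Moving the cursor onto LETTERS, say, fills the Files window below with that directory’s files, ready for an X to execute one of them.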

The third window contains a set of statistics relating to the currently selected directory, disk drive and file.

A wide range of DOS commands is available via the XTree menus and windows, and, as with HCD, the idea is that programs can be selected from this front-end program, and quitting the application puts you back in the front-end just where you left it.

Apart from XTree, the FileCard acts just like an XT’s hard disk, and although the noise of the drive was rather more obtrusive – the sound of the heads parking themselves in a safe landing zone whenever the drive fell idle was a little disturbing at first – the drive acted reliably and easily. Western Digital claims that the power consumption of the board in use, complete with a 512k piggyback board, is an astonishingly low 6W. That is described as ‘typical power usage’, though, which probably means peaks pushing 10W when the drive motor is active and the heads are moving, and troughs down near zero when the drive is idle and the heads are parked. There was no way to measure the peak power consumption or the operating temperature, but the board did not get too finger-burning hot and there were no surprise glitches in performance.

The FileCard is a beautifully built and finished board that performs well once it is installed and formatted. The installation is tougher than with the Hardcard, thanks to the different support system and general bulk of the FileCard, but not more difficult than, say, a typical internal modem for the PC. The problems I had with the software installation caused me more worries, but eventually – and with bitten fingernails – I got it working. The installation software is rather too cryptic and leaves too much for the user to do, in my opinion, while never explaining, in words of one syllable, what exactly is going on.


The JVC JD-S3812MOSO, known simply as the Dinasti in the UK, has emerged from the new products department of the Japanese giant’s stereo division, and is another 10Mbyte drive on a controller card.

The difference here is that while the Hardcard has obviously been designed from the ground up to be as slim and unobtrusive as possible, and while the FileCard has been neatly integrated by expert controller designers, the JVC board shows signs of being bolted together out of what was available at the time.

The metal disk drive casing is thicker and heftier than the casings on the other two cards, and the controller section – although using the same Western Digital VLSI chips as the FileCard – is cruder and less finished. It is also the only board of the three to need a power feed direct from the PC’s mains power converter rather than drawing its watts from the supply lines in the PC bus.

Although the board is as simple to install as the other two, there are some problems relating to its design. Firstly, there is no pretence that the board only takes up one slot. In the slim pamphlet which acts as a manual, translated from the Japanese with unintended comic effect, there is no indication at all of how many slot spaces the board requires.

In practice, it turned out to need an empty slot or only a short card on either side in the Kaypro PC, where the slot spacing is the same as that in the IBM XT. This is thanks to the bulky drive casing on the board, which sticks out an equal and awkward amount on both sides.

Then the power supply cable needs to be fitted, and the manual gives no real indication of how this is done. For the experienced PC user it is clear – though certainly not from the manual – that JVC intends you to remove the supply plug of one of the floppies and plug it into the splitter cable provided with the board. The two split ends then plug back into the floppy board and into the hard disk board.

It was actually simpler than this on the Kaypro, since this machine is already set up to take internal hard disks and has an extra power supply cable already provided. In this installation, the splitter cable was only used to extend the hard disk supply cable to reach the JVC board.

After that, and after removing a piece of metal which was blocking the spline from sliding down the card guide, the board slipped into place just like the others.

Unlike the Hardcard and the FileCard, the JVC board does the minimum of set-up and software installation for you. The disk is physically formatted at the factory, but the user is still required, according to the manual, to create an MS-DOS partition on the disk and then format it for use with the operating system. This would be easy enough with the FDISK utility provided with MS-DOS, but in fact the review board had already had the partition installed, presumably by the UK distributor. The format is done with the normal Format utility, using the /S and /V options to install system tracks on the Winchester. The system will then boot from the hard disk if there is no floppy in drive A, and after that will act, unsurprisingly, just like an XT.
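For anyone facing the same job, the sequence the manual implies runs roughly as follows under PC-DOS or MS-DOS 2.x (C: is assumed here to be the new Winchester, and the exact prompts vary between DOS versions):

```
A> FDISK                  create a DOS partition on the new drive,
                          then reboot as FDISK instructs
A> FORMAT C: /S /V        format the partition; /S copies the system
                          files across so the disk can boot, and /V
                          prompts for a volume label
```

After that the machine boots from the hard disk whenever drive A is empty, as described above.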

Once installed, the board worked with no problems at all, and was as fast and useful as the other two. True, it did not have Plus’ HCD system or Western Digital’s XTree, or the Hardcard’s neat extras like the flashing plus sign to show disk access. But it did the job, and was as unobtrusive in use as the other cards.

The main things in the JVC Dinasti’s favour are its cheap price and early availability, although there are now signs that Hardcards and FileCards are starting to arrive in quantity, along with the US competitors and UK contenders like the Plus 5 board and the Insider.

The main limitations are the size of the actual drive, which is bulky and would be awkward to cram into a PC with any kind of expansion already fitted; the lack of automatic installation and any helpful software; and the manual, which is skimpy to say the least.

Still, there is something to be said for keeping it simple, keeping it cheap, and stacking it high. Japanese companies have been good at all those things over the last 20 years, and no doubt the JVC Dinasti will be just the first of the Japanese hard disk boards to emerge.


The obvious market for all these hard disk boards is the large community of IBM PC owners, and owners of compatibles, who have previously been running their machines on floppies only. Most of these machines have no space to install an internal hard disk, and upgrading before these boards came along would have involved a bulky external Winchester box.

Now, as long as there is enough slot space, a hard disk can be plugged neatly into the internal bus, with no external sign that the machine has had its computing power boosted.

Choosing the right board from the many on offer depends on what type of capacity you want at what price. Of the three boards tested here, the JVC product would be fine if you have a lot of free slot space and know what you are doing with DOS partitions and the like. If you need RAM expansion as well, the FileCard would be the way to go. And if you want the smallest, neatest and narrowest package, then the Hardcard would be hard to beat. Of course, if 20Mbytes is essential, you would have to go to Mountain or Tandon, as all three boards here have a 10Mbyte maximum.

One area where the Hardcard scores is in future upgrading. If you want to add another 10Mbytes, another Hardcard can be plugged in as long as you have a spare slot. Likewise, two FileCards can be plugged in, but here the space issue is a bigger problem, particularly with an XT or a crowded compatible. I tried the Hardcard plugged in alongside the FileCard, and that worked as long as the jumper on one of them was changed to show that it was the second hard disk in the system – I changed the Hardcard jumper since it was easier to understand which way it went – and the second drive was installed or re-installed as drive D.

The JVC documentation flatly states that the Dinasti board must be the only hard disk sub-system in the machine – or, to put it the Japanese way: ‘When your computer has another Hard Disc Subsystem, cohabitation is not available.’

Some doubt must remain as to whether an IBM PC of really old vintage, with the 65W power supply and the 64k maximum motherboard, could cope with the extra 10W peak load when one of these boards is plugged in. And even if it could, it would need a new BIOS ROM set to handle the hard disk.

But the hard disk board manufacturers assure me that it is reasonable to use their products on such long-toothed machines. And naturally, all compatibles worth mentioning have a beefed-up supply that can cope easily.

The general impression of the three boards on test is that they are simple to install, simple to use, and improve the performance of a typically-sluggish IBM PC clone to a marked degree.

Technical specifications

                  Hardcard             FileCard             Dinasti
Capacity          10.56Mbytes          10.7Mbytes           10.65Mbytes
Transfer rate     5Mbits/s             3.2Mbits/s           3.2Mbits/s
Average access    65ms                 Not known            147ms
Power required    10.9W                6.08W                5.5-9.7W
Size              13.4 x 4.2 x 1ins    13.4 x 4.1 x 1.1ins  Not known
Weight            2.1lbs               2.2lbs               1.76lbs
Price             £775                 £795                 £605

First published in Personal Computer World magazine, May 1986