The Database Approach.
HPM states that most medium and larger firms have been in a transition towards consolidating their databases so that they can be shared and used throughout an organization. This certainly seems true in my experience. Information is being used more and more as a competitive weapon.
Some Major Advantages.
HPM then lists some of the major advantages of what they call the database approach.
- program-data independence - data descriptions are stored in a central repository - this helps make it so that data can change and evolve (within limits) without changing the applications that process the data (the first code sketch after this list illustrates this)
    - I consider this an ideal that can often be achieved (within limits)
    - this was much more difficult to achieve when certain data, or the nature of the data, wasn't very well known to everyone who really could use it
- planned data redundancy - organizing the data so that it is centrally coordinated and controlled makes it less likely that there are needless redundancies
    - for example - you are less likely to have more than one group collecting the same data
    - planned redundancies can occur for a variety of reasons - but it is desirable to have these occur due to intentional choices rather than a lack of organization and awareness of what is going on elsewhere
- improved data consistency - coordinating the data makes it much less likely that there will be conflicting data within the organization
    - for example - do you really know a customer's current address?
    - for example - if a customer's address changes, the change is made in one place, with much higher validity and accessibility
- improved data sharing - obviously, if the data that is available can be known throughout the entire organization, then there will be much better sharing of data, and possibly insights gleaned from the data and a better sense of what other data should be collected
- user views - developed so that a user can examine some salient subset of the data - usually done with a query or report (the second sketch after this list shows a simple user view)
- improved application development efficiency
    - if data is already available and well organized, then developers can more easily make use of it and design their applications appropriately
    - if certain data isn't available and is needed, they are more likely to be able to start getting it
    - a lot of tools will very likely already be built into the DBMS
        - report generators
        - query tools
        - end-user form development
    - if data can be securely accessed through the Internet, then this increases availability considerably
    - interface development can also be done using fairly common and well-known approaches based on HTML, JavaScript, and some sort of middleware
    - much less likely to need developers for fairly obscure languages
- enforcement of standards - this is generally a strong advantage - when approaches are no longer fragmented and isolated, standards can be more easily set that apply to everyone and work to the entire organization's advantage
- improved data quality - this is a huge issue if you've ever really worked with data - data quality is never as good as it should be
    - who entered it?
    - were mistakes made?
    - can procedures be implemented to improve the quality? (the third sketch after this list shows one way)
- improved data accessibility and responsiveness - this should be apparent from the other discussions
- reduced program maintenance - this is possibly more theoretically true than actually true - but generally these sorts of database approaches should ensure that at least the data sources and results meet certain standards, which helps make maintaining programs much easier, at least for these aspects
- improved decision support - some databases are expressly designed for decision support (the last sketch after this list gives a tiny example)
    - think about how data mining of sales data should be much improved
    - think about how someone should be able to access all the data they need to analyze the entire supply chain and not just fragments
    - financial analyses should also be improved
        - what are the real costs?
        - what is really making the firm money?
- organized central source of expertise - this can really allow an organization to have high-caliber database administrators and network/security administrators working out of a synergistic single source
    - unfortunately, it does happen that this central source can be overly controlling - and, worse, constantly use their specialized knowledge to unfairly influence which projects get done and which do not even get started - I've definitely seen these central sources work very hard to do as little as possible while trying to make sure no other expertise develops elsewhere
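To make the program-data independence idea above a bit more concrete, here is a minimal sketch using Python's built-in sqlite3 module. The table, columns, and data are all hypothetical; the point is that an application that names only the columns it needs keeps working even after the data description evolves (here, a new column is added).

    import sqlite3

    # A hypothetical customer table; names and columns are made up for illustration.
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT, city TEXT)")
    db.execute("INSERT INTO customer (name, city) VALUES ('Acme Corp', 'Trenton')")

    def list_customers(conn):
        # The "application" asks only for the columns it needs, by name.
        return conn.execute("SELECT name, city FROM customer").fetchall()

    print(list_customers(db))    # works against the original description of the data

    # The data description evolves in the central repository: a column is added.
    db.execute("ALTER TABLE customer ADD COLUMN phone TEXT")

    print(list_customers(db))    # the unchanged application still works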
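The user-view idea can be sketched the same way. Again the schema is made up; the view exposes a salient subset of the data, and the user queries it without needing to know how the underlying table is organized.

    import sqlite3

    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, amount REAL, region TEXT)")
    db.executemany("INSERT INTO orders (customer, amount, region) VALUES (?, ?, ?)",
                   [("Acme", 120.0, "East"), ("Baker", 75.5, "West"), ("Acme", 40.0, "East")])

    # A view tailored to one group of users: only eastern-region orders, only two columns.
    db.execute("CREATE VIEW east_orders AS "
               "SELECT customer, amount FROM orders WHERE region = 'East'")

    # The user queries the view like a table, without seeing the rest of the data.
    print(db.execute("SELECT * FROM east_orders").fetchall())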
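One concrete way to implement procedures that improve data quality, as the data quality item above asks, is to push validation rules into the database itself so that every application and every person entering data faces the same checks. A hypothetical sketch:

    import sqlite3

    db = sqlite3.connect(":memory:")
    # Hypothetical rules: every customer must have a name, and a state code must be two letters.
    db.execute("""
        CREATE TABLE customer (
            id    INTEGER PRIMARY KEY,
            name  TEXT NOT NULL,
            state TEXT CHECK (length(state) = 2)
        )
    """)

    db.execute("INSERT INTO customer (name, state) VALUES ('Acme Corp', 'NJ')")   # accepted

    try:
        db.execute("INSERT INTO customer (name, state) VALUES (NULL, 'New Jersey')")
    except sqlite3.IntegrityError as err:
        # The DBMS rejects the bad row no matter who entered it or from which application.
        print("rejected:", err)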
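Finally, for the decision support item, the simplest illustration is an aggregate query over centrally available sales data - the kind of question that is painful to answer when the data is fragmented across systems. The table and figures here are hypothetical.

    import sqlite3

    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE sales (product TEXT, region TEXT, amount REAL)")
    db.executemany("INSERT INTO sales VALUES (?, ?, ?)",
                   [("widget", "East", 100.0), ("widget", "West", 60.0),
                    ("gadget", "East", 80.0), ("gadget", "West", 150.0)])

    # Which products are really making the firm money, and where?
    for row in db.execute("""
            SELECT product, region, SUM(amount) AS total
            FROM sales
            GROUP BY product, region
            ORDER BY total DESC
            """):
        print(row)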
In the early 1990s, when I was a professor at Rider University in NJ, I worked on some analyses to help them predict which students who had applied would actually matriculate. I was astounded at how uncoordinated and unreliable most of the data was. Admissions had its own data sources. Financial Aid was on a completely different system. These didn't really even tie into the system that was coordinating course registration, fees, and degree progress. There were plenty of other problems.
Many of these problems were straightened out by using a single system to organize all of this sort of data within the University. They chose to go with Datatel. Unfortunately, this system wasn't all that user friendly, and it definitely did not have a Windows interface. But at least by having one coordinating system they were much more confident that they knew where to find things. They also took fairly strong measures to help ensure that the data that was entered was reliable.
But I should say more. I was on fairly good terms with the system administrators at Rider. They were all truly experts in VAX/VMS systems, and it was easy to respect their intelligence. There was one administrative VAX super-mini and two academic VAX super-minis. Unfortunately, even though this was the 1990s, we could not get them to move towards UNIX or Windows. So the implementation of Datatel they chose to use was really quite primitive in comparison to what was available for other platforms.
Now for another interesting question/issue. When Rider decided to totally revamp their library systems, how compatible should the new systems have been with their existing systems? Rider Libraries did choose to go with a Windows-based system that was almost entirely separate from other academic computing foci. They fairly cleverly chose to be a test site for one of the first Windows-based library information systems. They got a great deal and were confident they would get great treatment, because the company doing the development really wanted to use them and one other school for referrals and examples of how their systems really worked. From what I saw, this plan worked like a charm.
For example, the computing group within the faculty had two of their own Windows servers and a computer lab focused on teaching computing courses. The Rider administrators were usually quite reasonable to work with when it came to these sorts of needs, even though they wouldn't implement these sorts of things in their own central data center.
Some Major Disadvantages.
Most of these disadvantages can also be turned into advantages.
- new specialized personnel
    - how expensive are they?
    - who will manage them?
    - how much are they actually contributing to the organization's competitiveness and functionality?
- installation costs and complexity
    - these approaches tend to involve higher costs and more complexity
- need for explicit backup and recovery
    - in my mind, this is more of an advantage than a disadvantage
- organizational conflict
    - probably one of the biggest problems, relating to human, or maybe I should say inhuman, quests for selfish power and money
    - who gets the leverage of being in charge and in control of the database systems?
    - who might get left behind if they do not want to progress?
    - do the systems really meet the users' needs?
I am leaving out the discussion of traditional file processing systems for a number of reasons.