  
Week 3
In Review
The third and final week of this book covered several very important topics. This
week's work focused on database issues you encounter when you develop database applications
for multiple users or multiple sites. You learned advanced SQL language for manipulating
records within existing databases (DML). You also learned the five rules of data
normalization and how applying those rules can improve the speed, accuracy, and integrity
of your databases.
You learned about Visual Basic database locking schemes for the database, table,
and page level. You also learned the advantages and limitations of adding cascading
updates and deletes to your database relationship definitions. You learned how to
use Visual Basic keywords BeginTrans, CommitTrans, and Rollback
to improve database integrity and processing speed during mass updates.
You were introduced to the Remote Data Control and the Remote Data Objects. You
learned to use these tools to attach to RDBMSs. You learned the properties, methods,
and events of these useful tools.
You learned how to write data entry forms that use the ODBC API calls to link
directly with the ODBC interface to access data in registered ODBC data sources.
You also learned how to install the ODBC Administrator and how to create new ODBC
data sources for your ODBC-enabled Visual Basic programs.
You learned how to distribute data across multiple sites by using database replication.
You learned how to create a Design Master and replicas. You learned how to synchronize
data changes to a member of a replica set. You also learned how not to distribute
specified data tables during the synchronization process.
In the final lesson, you learned how to create application-level security schemes
such as user login and logout, program-level access rights, and audit trails to keep
track of critical application operations.
Day 15: Updating Databases with SQL
To start your final week of study, you learned how to add, delete, and edit data
within tables using the DML (Data Manipulation Language) SQL keywords. You learned
that by using DML statements you can quickly create test data for tables and load
default values into startup tables. You also learned that DML statements--such as
Append queries, Make Table queries, and Delete queries--can
outperform equivalent Visual Basic code versions of the same operations.
You learned how to manage data within the tables using the following DML keywords:
- The INSERT INTO statement can be used to add new rows to the table using
the VALUES clause.
- You can create an Append query by using the INSERT INTO...FROM
syntax to copy data from one table to another. You can also copy data from one database
to another by using the IN clause on an INSERT INTO...FROM statement.
- You can create new tables by copying the structure and some of the data using
the SELECT...INTO statement. This statement can incorporate WHERE,
ORDER BY, GROUP BY, and HAVING clauses to limit the scope
of the data used to populate the new table you create.
- You can use the DELETE FROM statement to remove one or more records from
an existing table. You can even create customized views of the data using the
JOIN clause and remove only the records that result from the JOIN.
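The DML statements above can all be issued from Visual Basic code through the Database object's Execute method. Here is a minimal sketch; the database path and the table and field names (Authors, AuthorsBackup, and so on) are hypothetical:

```vb
Dim db As Database
Set db = DBEngine.Workspaces(0).OpenDatabase("C:\data\books.mdb")

' Add a single row with INSERT INTO ... VALUES
db.Execute "INSERT INTO Authors (AuthorID, AuthorName) " & _
           "VALUES (101, 'A. Smith')"

' Append query: copy rows from one table to another
db.Execute "INSERT INTO AuthorsBackup " & _
           "SELECT * FROM Authors WHERE AuthorID > 100"

' Make Table query: create a new table from existing data
db.Execute "SELECT AuthorID, AuthorName INTO ActiveAuthors " & _
           "FROM Authors WHERE Active = True"

' Delete query: remove matching rows
db.Execute "DELETE FROM Authors WHERE AuthorID = 101"

db.Close
```

Because Execute runs the statement entirely inside the database engine, each of these queries avoids the record-by-record looping a pure Visual Basic version would require.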
Day 16: Database Normalization
In Day 16's lesson, you learned how to improve database integrity and access speed
using the five rules of data normalization. You learned the following five rules:
- Rule 1: Eliminate repeating groups. If you have a set of fields that have the
same name followed by a number (Skill1, Skill2, Skill3, and so forth), remove these
repeating groups, create a new table for the repeating data, and relate it to the
key field in the first table.
- Rule 2: Eliminate redundant data. Don't store the same data in two different
locations; doing so leads to update and delete errors. If the same data element
is entered in more than one place, move it to its own master table with a unique
key field, and then store that key field in the locations that formerly held the
duplicated data.
- Rule 3: Eliminate columns not dependent on keys. If a table contains data elements
that are not directly related to its primary key, move those elements to their own
table. Store only data elements that depend directly on the primary key. This rule
especially applies to derived data and other calculated values, which generally
should not be stored at all.
- Rule 4: Isolate independent multiple relationships. Use this rule to improve
database design when you are dealing with more than one one-to-many relationship
in the database. Before you add a new field to a table, ask yourself whether this
field is really dependent upon the other fields in the table. If not, create a new
table with the independent data.
- Rule 5: Isolate related multiple relationships. Use this rule to improve database
design when you are dealing with more than one many-to-many relationship in the database.
If you have database rules that require multiple references to the same field or
sets of fields, isolate the fields into smaller tables and construct one or more
link tables that contain the required constraints that enforce database integrity.
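As an illustration of Rule 1, a table with repeating Skill1, Skill2, and Skill3 columns can be split into a child table related back to the parent's key field. The sketch below uses Jet SQL DDL through DAO; the database path and the Employees/EmployeeSkills names are hypothetical:

```vb
Dim db As Database
Set db = DBEngine.Workspaces(0).OpenDatabase("C:\data\staff.mdb")

' Before: Employees(EmpID, Name, Skill1, Skill2, Skill3)
'         -- a repeating group of skill columns.
' After:  the skills move to their own table, one row per skill,
'         related to Employees through the EmpID key field.
db.Execute "CREATE TABLE EmployeeSkills " & _
           "(EmpID LONG, Skill TEXT(50), " & _
           "CONSTRAINT pkEmpSkill PRIMARY KEY (EmpID, Skill))"

' Each former SkillN column value becomes a row in the new table.
db.Execute "INSERT INTO EmployeeSkills (EmpID, Skill) " & _
           "VALUES (1, 'SQL')"
db.Execute "INSERT INTO EmployeeSkills (EmpID, Skill) " & _
           "VALUES (1, 'Visual Basic')"

db.Close
```

With the repeating group removed, an employee can hold any number of skills without schema changes, and queries such as "who knows SQL?" become a simple WHERE clause instead of a test against three separate columns.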
Day 17: Multiuser Considerations
On Day 17, you learned about the three important challenges that face every database
programmer writing multiuser applications:
- Database locking schemes
- Using cascading updates and deletes to maintain database integrity
- Using database transactions to provide commit/rollback options for major updates
to your database
You learned that there are three levels of locking available to Visual Basic programs:
- The database level--You can use the Exclusive property of the data control or
the second parameter of the OpenDatabase method to lock the entire database.
Use this option when you need to perform work that affects multiple database objects
(such as tables, queries, indexes, relations, and so on).
- The table level--You can set the Options property of the data control to 3 or
the third parameter of the OpenRecordset method to dbDenyRead+dbDenyWrite
in order to lock the entire table for your use only. Use this option when you need
to perform work that affects multiple records in a single table (for example, increasing
the sales price on all items in the inventory table).
- The page level--Microsoft Jet automatically performs page-level locking whenever
you use the data control to edit and save a record, or whenever you use Visual Basic
code to perform the Edit/AddNew and Update/CancelUpdate
methods. You can use the LockEdits property of the Recordset to set the page locking
to pessimistic (to perform locking at edit time) or optimistic (to perform locking
only at update time).
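The three locking levels can be sketched in a few lines of DAO code. The database path and table names below are hypothetical:

```vb
Dim wsp As Workspace
Dim db As Database
Dim rs As Recordset

Set wsp = DBEngine.Workspaces(0)

' Database level: the second argument (Exclusive) set to True
' locks the entire database for this program's use only.
Set db = wsp.OpenDatabase("C:\data\sales.mdb", True)

' Table level: deny other users read and write access to the table
' (the data control equivalent is setting its Options property to 3).
Set rs = db.OpenRecordset("Inventory", dbOpenTable, _
                          dbDenyRead + dbDenyWrite)

' Page level: LockEdits = True gives pessimistic locking (the page
' locks at Edit time); False gives optimistic locking (the page
' locks only at Update time).
rs.LockEdits = True
rs.Edit
rs!Price = rs!Price * 1.1
rs.Update

rs.Close
db.Close
```

Use the narrowest lock that protects the operation: page-level locks inconvenience other users the least, while an exclusive database lock should be reserved for maintenance work that touches many objects at once.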
You learned how to use Visual Basic to enforce referential integrity and automatically
perform cascading updates or deletes to related records. You learned that there are
times when it is not advisable to establish cascading deletes (for example, do not
use cascading deletes when the base table is a validation list and the foreign table
is a master).
Finally, you learned how to use database transactions to protect your database
during extended, multitable operations. You learned how to use the BeginTrans,
CommitTrans, and Rollback methods of the workspace object. And
you learned some of the advantages and limitations of transaction processing.
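A typical transaction wraps a multitable update so that the changes either all succeed or all disappear. A minimal sketch, assuming hypothetical database and table names:

```vb
Sub MassPriceUpdate()
    Dim wsp As Workspace
    Dim db As Database

    Set wsp = DBEngine.Workspaces(0)
    Set db = wsp.OpenDatabase("C:\data\sales.mdb")

    On Error GoTo TransError
    wsp.BeginTrans
    ' Both updates succeed or fail as a single unit of work.
    db.Execute "UPDATE Inventory SET Price = Price * 1.1"
    db.Execute "UPDATE Categories SET LastUpdated = Now()"
    wsp.CommitTrans      ' write all pending changes at once
    db.Close
    Exit Sub

TransError:
    wsp.Rollback         ' undo everything since BeginTrans
    MsgBox "Update failed; all changes were rolled back."
    db.Close
End Sub
```

Note that the transaction methods belong to the Workspace object, so every database opened from that workspace participates in the same transaction.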
Day 18: Using the Remote Data Control and the Remote Data Objects
On Day 18, you learned about two alternate methods for accessing remote data.
You learned that you can use the Remote Data control to create simple data entry
forms with data-bound controls. You also learned to use the Remote Data Objects to
create Visual Basic 5.0 programs that can access data from a remote RDBMS.
Along with the details of the Remote Data Control and the Remote Data objects,
you also learned some of the basics of remote data access in general. You learned
about:
- Cursor drivers--These are the tools that manage the location of the Recordset
pointer in a dataset. You learned you can use client-side or server-side cursor drivers
with RDC/RDO connections.
- Dataset types--You learned there are a number of dataset types available to you
when you connect to remote data sources, including ForwardOnly/ReadOnly sets, Static
sets, Keysets, and Dynamic sets.
- Lock types--You learned there are several different lock types you can use when
accessing data from your remote data source. You can use ConcurrentLock sets that
perform locks as soon as you receive the data rows, or you can use several versions
of optimistic locking that only attempt to lock the rows when you update them.
You also learned the details of the following Microsoft Remote Data Objects:
- The rdoEngine object--This is the top-level data engine used to access remote
data.
- The rdoEnvironment object--This is the RDO equivalent of the Microsoft Jet Workspace
object.
- The rdoConnection object--This is the RDO equivalent of the Microsoft Jet Database
object.
- The rdoResultset object--This is the RDO equivalent of the Microsoft Jet Recordset
object.
- The rdoTable object--This is the RDO version of the Microsoft Jet TableDef object.
- The rdoColumn object--This is the RDO version of the Microsoft Jet Field object.
- The rdoQuery object--This is the RDO version of the Microsoft Jet QueryDef object.
- The rdoParameters object--This is a special collection of query parameters for
the rdoQuery object.
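The RDO object hierarchy parallels the DAO code you have already written. A minimal sketch of opening a connection and walking a resultset; the data source name, login values, and the authors table come from the SQL Server pubs sample and are assumptions here:

```vb
' Requires a project reference to Microsoft Remote Data Object 2.0.
Dim env As rdoEnvironment
Dim cn As rdoConnection
Dim rs As rdoResultset

Set env = rdoEngine.rdoEnvironments(0)

' "PubsDSN" is a hypothetical registered ODBC data source name.
Set cn = env.OpenConnection("PubsDSN", rdDriverNoPrompt, False, _
                            "uid=sa;pwd=;")

' Open a Keyset-type resultset with optimistic (row version) locking.
Set rs = cn.OpenResultset("SELECT au_lname FROM authors", _
                          rdOpenKeyset, rdConcurRowVer)

Do While Not rs.EOF
    Debug.Print rs!au_lname
    rs.MoveNext
Loop

rs.Close
cn.Close
```

Compare the shape of this routine with its DAO equivalent: rdoEnvironment stands in for the Workspace, rdoConnection for the Database, and rdoResultset for the Recordset.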
Day 19: ODBC Data Access via the ODBC API
On Day 19, you learned how to use the Open Database Connectivity (ODBC) API to
directly link your Visual Basic program to target data sources through the ODBC interface.
The ODBC interface is generally faster than Microsoft Jet when it comes to linking
to ODBC-defined data sources.
You also learned how to install the ODBC interface on your workstation and how
to use the ODBC Administrator program to install ODBC driver sets and define data
sources for ODBC connections.
You learned how to build a program library that uses a minimum set of ODBC API
calls along with several Visual Basic wrapper routines. This library set provides
the basic functions necessary to read and write data to and from a defined ODBC data
source. You can use these routines to create fully functional data entry forms for
ODBC data sources.
Finally, you used the library routines to build a data entry form that opens a
link to a defined ODBC data source and allows the user to read and write data records
for the ODBC data source.
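The heart of such a library is a set of Declare statements that expose the ODBC API to Visual Basic. The sketch below shows the kind of declarations involved; the wrapper routine names used in the chapter's own library may differ:

```vb
' Core ODBC API entry points in odbc32.dll (module-level declarations).
Declare Function SQLAllocEnv Lib "odbc32.dll" _
    (phenv As Long) As Integer
Declare Function SQLAllocConnect Lib "odbc32.dll" _
    (ByVal henv As Long, phdbc As Long) As Integer
Declare Function SQLConnect Lib "odbc32.dll" _
    (ByVal hdbc As Long, _
     ByVal szDSN As String, ByVal cbDSN As Integer, _
     ByVal szUID As String, ByVal cbUID As Integer, _
     ByVal szAuthStr As String, ByVal cbAuthStr As Integer) As Integer
Declare Function SQLDisconnect Lib "odbc32.dll" _
    (ByVal hdbc As Long) As Integer
Declare Function SQLFreeConnect Lib "odbc32.dll" _
    (ByVal hdbc As Long) As Integer
Declare Function SQLFreeEnv Lib "odbc32.dll" _
    (ByVal henv As Long) As Integer
```

The Visual Basic wrapper routines then allocate an environment handle, allocate and connect a connection handle against a registered data source name, and free the handles in reverse order when the form closes.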
Day 20: Database Replication
Day 20 focused on database replication. In database replication terminology, the
main or central database is referred to as the Design Master. A copy of the Design
Master is referred to as the replica. The combination of the Design Master and all
replicas is referred to as the replica set. Database replication is the process of
synchronizing data so that it is the same across all members of the replica set.
Database replication is a good tool to use in the development of systems deployed
across a WAN or to remote users. Replication can also be used to make copies of databases
that cannot be shut down. Replication is also good for creating reporting databases
and data marts.
Do not use database replication when a centralized data storage facility can be
used, such as a Web-enabled application. Also, don't use replication in heavily transaction-oriented
applications, or in applications where up-to-the minute accuracy is of paramount
importance.
Tables, fields, and properties are added to a database when it is made a Design
Master. The addition of these items is necessary to track changes to data and to
facilitate the synchronization between members of the replica set. These additions,
however, consume additional physical hard drive space.
Creating a Replicable property on a database and setting it to "T" converts
the database into a Design Master. Once the Design Master is created, you can use the MakeReplica
method to make copies of it. Finally, you use the Synchronize method to
replicate data changes to members of the replica set. Data synchronization is the
act of copying data changes from one member of a replica set to another.
The Synchronize method can be used to import data changes, export data
changes, perform "two-way" data changes, and even perform data exchanges
over the Internet.
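The whole sequence takes only a few DAO calls. A minimal sketch, with hypothetical file paths:

```vb
Dim db As Database
Dim prp As Property

' Open the database exclusively before changing its design.
Set db = DBEngine.Workspaces(0).OpenDatabase("C:\data\main.mdb", True)

' Creating the Replicable property and setting it to "T"
' converts the database into a Design Master.
Set prp = db.CreateProperty("Replicable", dbText, "T")
db.Properties.Append prp

' Make a replica, then later exchange changes with it both ways.
db.MakeReplica "C:\data\replica1.mdb", "First replica"
db.Synchronize "C:\data\replica1.mdb", dbRepImpExpChanges

db.Close
```

Remember that converting a database to a Design Master is a one-way trip, so always run this against a backup copy first.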
Synchronization errors occur when two members of a replica set try to synchronize
records that both have changed. Errors can also occur when design changes made to
the Design Master conflict with data a replica entered before those changes were
synchronized. For example, a replica can violate referential integrity by adding
records that depend on validation records another replica has deleted. Record
locking in a multiuser environment can also cause synchronization errors.
There are four topologies for the synchronization of replicas. These are the star,
linear, ring, and fully connected topologies. The star is the most common, but like
all the other topologies it has certain strengths and weaknesses.
There may be times when you do not want to replicate objects contained in one
database to other members of the replica set. If such is the case, set the KeepLocal
property on the object before you create the Design Master. This property keeps the
object from being copied to other replica set members.
Day 21: Securing Your Database Applications
In your final lesson in this book, you learned several methods that can improve
user and application-level security for your Visual Basic database applications.
You learned about the limitations of using the Microsoft Access SYSTEM security
file and database encryption.
This lesson also showed you how you can add application-level security to your
Visual Basic programs by adding user login/logout routines and creating a user access
rights scheme for your applications. In this lesson, you designed and implemented
an OLE Server DLL library that you can use for all your Visual Basic applications,
and you created several screens for maintaining user lists and managing access rights
for each user.
You also learned how to add an audit trail option to your programs. You added
routines to a new OLE Server DLL library that logs all critical user activity to
an audit trail file, including user logins, database modifications, and all critical
program operations, such as running reports or processing mass database updates.
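An audit-trail routine of the kind such a library exposes can be quite small. The sketch below is illustrative only; the routine name, log path, and record layout are assumptions, not the chapter's actual library interface:

```vb
' Append one time-stamped entry to a flat audit-trail file.
' The log path and field layout here are hypothetical.
Public Sub WriteAuditRecord(ByVal UserID As String, _
                            ByVal Action As String)
    Dim nFile As Integer
    nFile = FreeFile
    Open "C:\data\audit.log" For Append As #nFile
    Print #nFile, Format(Now, "yyyy-mm-dd hh:nn:ss") & _
                  vbTab & UserID & vbTab & Action
    Close #nFile
End Sub

' Usage: log a login and a critical operation.
' WriteAuditRecord "jsmith", "LOGIN"
' WriteAuditRecord "jsmith", "MASS UPDATE: Inventory table"
```

Putting the routine in an OLE Server DLL means every application gets the same audit format, and the log path can be changed in one place.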
  