Thread: Web server and database server on same machine
Thu, Mar 2 2006 11:28 AM

Danny Humphress
Our main product is mostly Windows based but we do have one web-based (CGI) component that accesses the same database. This has worked fine
for us through the years but we now have a large client that is putting a strain on the system and I'm not sure how to proceed.

Here's the situation:

1. Very "beefy" server with lots of memory and multiple Xeon processors
2. Data resides on server HD
3. DBISAM database server runs on server as a service
4. Windows applications on client PCs access data through the database server (remote)
5. Web application runs on IIS on same server and accesses data directly (not remote)
6. The web application mostly runs queries for users searching the database

Everything runs pretty well until we start getting a lot of traffic on the web application. At that point, users of the Windows application start seeing
significant delays in posting updates to the database. The system is for library management. Users report that during times of high web activity, it can
take 2-4 minutes to check out a book -- something that normally takes a fraction of a second. Server utilization soars to 100% and hovers there until
web activity goes back down.

I have considered the following:

1. Moving the web application to another server. The data would then have to be pulled across the network and accessed either directly or through the
server. My client resists this option because of the costs involved, but it might be the best solution. Still, I am worried that the real problem is
that DBISAM simply doesn't handle a dozen or so simultaneous queries well, in which case moving the web application won't improve things much.

2. Converting the web application from CGI to ISAPI. I have been working on this but it has proven difficult (I keep running into errors that are hard to
trace). I am also concerned that this might actually make the situation worse: at least now the process will time out if a user sends it a wacky
query -- with ISAPI, it would continue to run and bog down the system until it completes.

Does anyone have any advice?
Thu, Mar 2 2006 12:04 PM

Jose Eduardo Helminsky
Danny Humphress

You don't mention the data size, but if possible I would convert these queries to table operations. Depending on the size of the data, even an optimized query can take a while to return
results, whereas table access through an appropriate index (without filters) is *ALWAYS* faster. I don't think moving IIS to another machine will solve the problem. Maybe someone else will have more advice for you.

BTW, I am very interested in hearing how you manage to solve or minimize the trouble. Good luck

Eduardo
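
The table-versus-query approach Jose suggests can be sketched roughly as follows. This is a hypothetical example, not code from the thread: the index name and field names are invented, and it assumes DBISAM 4's BDE-style TDBISAMTable API (IndexName, SetRange) from the dbisamtb unit.

```delphi
// Sketch: replace a simple single-table query with an index range scan.
// Index and field names ('ByAuthor', 'Title') are hypothetical.
procedure ListTitlesByAuthor(Table: TDBISAMTable; const Author: string);
begin
  Table.IndexName := 'ByAuthor';       // switch to a secondary index on Author
  Table.SetRange([Author], [Author]);  // index range scan instead of a WHERE clause
  try
    Table.First;
    while not Table.Eof do
    begin
      // e.g. collect Table.FieldByName('Title').AsString here
      Table.Next;
    end;
  finally
    Table.CancelRange;                 // restore the full table view
  end;
end;
```

This only works for lookups that map onto a single table and index, which is why it doesn't help Danny's join-heavy queries below.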
Thu, Mar 2 2006 1:14 PM

Danny Humphress
Jose,

The database is pretty big -- about 1GB.

We make extensive use of SQL (joins, etc.), so it would not be practical to move to tables.

Danny
Thu, Mar 2 2006 3:09 PM

Jose Eduardo Helminsky
Danny Humphress

>> We make extensive use of SQL (joins, etc.), so it would not be practical to move to tables.

I understand. Another suggestion is to avoid joins and use calculated fields instead. That way you stay away from canned result sets and performance will be much better, but I know
sometimes there is no way to AVOID joins. Let's wait for other advice.

Eduardo
Fri, Mar 3 2006 10:59 AM

Tim Young [Elevate Software]

Elevate Software, Inc.

Email timyoung@elevatesoft.com

Danny,


<< Everything runs pretty well until we start getting a lot of traffic on
the web application. At that point, users of the Windows application start
seeing significant delays in posting updates to the database. The system is
for library management. Users report that during times of high web activity,
it can take 2-4 minutes to check out a book -- something that normally takes
a fraction of a second. Server utilization soars to 100% and hovers there
until web activity goes back down. >>

Are you using the latest versions with the reduced read locking that was put
in place for you? How many users (approximately) are querying at the time
when the slowdowns occur? Have you tried bumping up the
TDBISAMEngine.TableMaxReadLockCount property? That will slow down the
queries but improve the concurrency.

http://www.elevatesoft.com/dbisam4d5_tdbisamengine_tablemaxreadlockcount.htm

Also, are you still allowing un-optimized queries on the database tables
that cause a brute-force read of the table ?

--
Tim Young
Elevate Software
www.elevatesoft.com
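
For reference, bumping the property Tim mentions is a small engine-level change. A minimal sketch, assuming the global Engine instance exposed by DBISAM's dbisamtb unit and that engine settings must be applied while the engine is inactive; the value 100 is purely illustrative, not a recommendation from the thread:

```delphi
// Sketch: raise the read-lock ceiling on the DBISAM engine.
// Per Tim's note, a higher value trades per-query speed for concurrency.
procedure RaiseReadLockCeiling;
begin
  if Engine.Active then
    Engine.Active := False;            // settings apply only while inactive
  Engine.TableMaxReadLockCount := 100; // illustrative value; tune per workload
  Engine.Active := True;
end;
```

On a deployment like Danny's this would go wherever the engine is configured at startup -- the database server service for remote clients, and the web application's own engine for its direct access.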

Mon, Mar 6 2006 12:22 PM

Danny Humphress
Tim,

Yes, we are using the latest versions. I assume the read locking changes are included in all newer versions. I haven't done anything with
TableMaxReadLockCount (using default). Where should I change this (on the Windows-based app that slows down and/or the web app) and
what value range do you suggest?

I'm not sure how many users, but I have seen things slow down when there are just a handful (maybe 2-3) queries running.

There shouldn't be any un-optimized queries though most are only partially optimized because of joins and multiple WHERE clauses.

Danny



Tue, Mar 7 2006 3:18 PM

Tim Young [Elevate Software]

Elevate Software, Inc.

Email timyoung@elevatesoft.com

Danny,

This was solved via email: the problem was the TableMaxReadLockCount
setting being too low, combined with using a database-wide transaction
instead of a restricted transaction when only updating a few tables.

--
Tim Young
Elevate Software
www.elevatesoft.com
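
The restricted transaction Tim describes can be sketched in code. A minimal, hypothetical example assuming DBISAM 4's optional table-list argument to TDBISAMDatabase.StartTransaction; the table names are invented to fit the library check-out scenario:

```delphi
// Sketch: lock only the tables being updated instead of the whole database.
// Table names ('Loans', 'Copies') are hypothetical.
procedure CheckOutBook(DB: TDBISAMDatabase);
var
  Tables: TStrings;
begin
  Tables := TStringList.Create;
  try
    Tables.Add('Loans');
    Tables.Add('Copies');
    DB.StartTransaction(Tables);  // restricted: other tables stay readable
    try
      // ... post the check-out rows here ...
      DB.Commit;
    except
      DB.Rollback;
      raise;
    end;
  finally
    Tables.Free;
  end;
end;
```

With a database-wide transaction (StartTransaction with no table list), the check-out would have to wait for read locks across every table the web queries touch, which matches the multi-minute stalls Danny reported.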
