Thread: In Memory
Thu, Aug 3 2017 2:19 AM

Steve Gill


Ok, here's my latest crazy idea.  This is a client/server system.

I was thinking (dangerous, I know), that if I created frequently used tables as memory tables it would improve performance by bypassing the file system.  Although at some stage the system would have to copy the data back to the disk-based tables.  If the server decided to crash or someone rebooted it before then it would be disastrous as they would lose the data in memory.  Not so good.  Maybe it could be done when the server is idling.

I'm just thinking this stuff through at the moment.  The idea may end up being totally insane.  Smile

Anyhoo, from what I've read in the EDB manual, it's possible to create in-memory databases but not individual in-memory tables.  Is that correct?

Maybe I could create an entire in-memory database. Or perhaps an in-memory database that only has the frequently used tables.  

Is there an easy and reliable way to create in-memory databases/tables *with* data from a file-based database, and copy it back again, or do I just use standard SQL to do it?  Just thought I'd ask in case there was some cool, secret method I didn't know about. Smile

Thanks.

= Steve
Thu, Aug 3 2017 4:49 AM

Matthew Jones

Steve Gill wrote:

> my latest crazy idea

I'd stick with that. 8-)

First step is to look at the server, and see how it is doing for memory. Most of the time, the disk should be being cached, such that anything anywhere near active should be in memory anyway. Thus the time taken to read should be pretty insignificant compared to programming it differently. Particularly if the data is actually critical as you suggest.

Then I'd look at how I can increase such caching, look at the ElevateDB settings that allow you to cache lots, and generally put the effort into that.

You might be able to make improvements with in-memory data, but if it matters, I'd keep it all on disk. Anything else adds risk of sync problems etc. If it was temporary, then go for it.

(All my general feeling of course, in some situations the risk would be worth taking, with increased development and test cost of course.)

--

Matthew Jones
Thu, Aug 3 2017 7:54 AM

Roy Lambert

NLH Associates

Team Elevate

Steve


I disagree with Matthew - I don't think it's a crazy idea at all, but I do think it will take a lot of work, and how useful it will be depends on what you want to do with the tables.

My recollection may be wrong but I think you'd need to do something like:

1. Create a local session
2. Create an in-memory database using the local session

these two steps mean that your tables will be held locally

3. Create the appropriate tables in the local database in the local session
4. Stream data from the server
5. Load the streamed data into the local in-memory tables

alternatively

Create a thread and use navigation methods to load the tables

The single biggest problem is that you can't use SQL across session boundaries. If all you need are navigational methods, or to do things to the tables in the in-memory database, you're OK.

For updating purposes I'd first see if it's possible to repurpose the built-in publishing capability (I suspect not), or create a thread that simply checks for changes and, if found, uses navigational methods to post across to the server (or creates SQL that can be run in the context of the c/s session & database). You could equally well have an AfterPost event for each table.
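To illustrate what I mean by generated SQL: a change-watcher thread (or an AfterPost event) would build a statement from the change to the in-memory copy and execute it in the context of the c/s session. The table and column names here are invented purely for illustration:

```sql
-- Hypothetical statement generated from a posted change to the
-- in-memory copy, then executed against the remote c/s database.
-- "Customer", "Name" and "ID" are made-up names for this example.
UPDATE "Customer"
SET "Name" = 'New name'
WHERE "ID" = 42;
```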


To comment more, as C5 would have said "input input"

Roy Lambert
Thu, Aug 3 2017 10:34 AM

Tim Young [Elevate Software]

Elevate Software, Inc.


Email timyoung@elevatesoft.com

Steve,

<< Is there an easy and reliable way to create in-memory databases/tables *with* data from a file-based database, and copy it back again, or do I just use standard SQL to do it? >>

Sure, you can use the BACKUP DATABASE/RESTORE DATABASE statements to do so.
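From memory, the round trip looks something like the following - the database, backup, and store names are placeholders, and you should check the exact clauses against the manual for your EDB version:

```sql
-- Rough sketch of the BACKUP/RESTORE round trip (all names are
-- placeholders; verify the exact clauses against the EDB manual).
BACKUP DATABASE "MyDB" AS "MyDB-Backup" TO STORE "Backups" INCLUDE CATALOG;

-- ...later, restore the backed-up data into the target database:
RESTORE DATABASE "MyDB" FROM "MyDB-Backup" IN STORE "Backups" INCLUDE CATALOG;
```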

However, don't do this.  You'll have to keep the in-memory database segmented from your "real" database, which is a real PIA in terms of your application design, and what Matthew said: Windows and ElevateDB are *already* caching most of your data in memory, at least the data that needs to be in memory.

There's a great little tool from MS called RAMMap that will show you exactly which files are cached, and how much of them are cached:

https://docs.microsoft.com/en-us/sysinternals/downloads/rammap

Are you actually seeing performance issues?

Tim Young
Elevate Software
www.elevatesoft.com
Thu, Aug 3 2017 6:28 PM

Steve Gill


<< First step is to look at the server, and see how it is doing for memory. Most of the time, the disk should be being cached, such that anything anywhere near active should be in memory anyway. Thus the time taken to read should be pretty insignificant compared to programming it differently. Particularly if the data is actually critical as you suggest.

Then I'd look at how I can increase such caching, look at the ElevateDB settings that allow you to cache lots, and generally put the effort into that.>>

Thanks Matthew.  I'll look into the EDB cache settings.

= Steve
Thu, Aug 3 2017 6:31 PM

Steve Gill


<< I disagree with Matthew - I don't think it's a crazy idea at all, but I do think it will take a lot of work, and how useful it will be depends on what you want to do with the tables.

My recollection may be wrong but I think you'd need to do something like:

1. Create a local session
2. Create an in-memory database using the local session

these two steps mean that your tables will be held locally

3. Create the appropriate tables in the local database in the local session
4. Stream data from the server
5. Load the streamed data into the local in-memory tables

alternatively

Create a thread and use navigation methods to load the tables

The single biggest problem is that you can't use SQL across session boundaries. If all you need are navigational methods, or to do things to the tables in the in-memory database, you're OK.

For updating purposes I'd first see if it's possible to repurpose the built-in publishing capability (I suspect not), or create a thread that simply checks for changes and, if found, uses navigational methods to post across to the server (or creates SQL that can be run in the context of the c/s session & database). You could equally well have an AfterPost event for each table. >>

Thanks Roy, it sounds like a lot of work for something that might end up being more trouble than it's worth.

= Steve
Thu, Aug 3 2017 6:46 PM

Steve Gill


<< Sure, you can use the BACKUP DATABASE/RESTORE DATABASE statements to do so.

However, don't do this.  You'll have to keep the in-memory database segmented from your "real" database, which is a real PIA in terms of your application design, and what Matthew said: Windows and ElevateDB are *already* caching most of your data in memory, at least the data that needs to be in memory.

There's a great little tool from MS called RAMMap that will show you exactly which files are cached, and how much of them are cached:

https://docs.microsoft.com/en-us/sysinternals/downloads/rammap  >>

Thanks, I'll check that out.

<< Are you actually seeing performance issues? >>

The main reason behind this is that I want the software to be as fast as possible, so I'm looking at anything that can speed things up.

However, I am also seeing performance issues with a couple of customers.  They have around 15 users each and it's running pretty slow.  I suspect it's anti-virus software because I have customers with a lot more users and they're working fine.

Both of the systems in question are managed by IT support companies and things are locked down pretty tight.
I have remotely connected to both but can't access the AV software to see what happens when it's turned off.  And I can't even get access to their servers.

I have to deal through the IT support companies and they're always slow to respond and hard to coordinate with.  I'm often surprised how little some of the "techs" actually know.  It's often an uphill battle explaining things to them.

Anyway, I have one of them running both workstation and server traces to see what's going on and whether anything is slowing things down.  Maybe it will show me where efficiencies can be made.

= Steve
Fri, Aug 4 2017 3:04 AM

Roy Lambert

NLH Associates

Team Elevate

Tim


Nice utility - thanks for pointing it out.

Roy Lambert
Fri, Aug 4 2017 3:09 AM

Roy Lambert

NLH Associates

Team Elevate

Steve


It really depends on the application but what about going all local with publishing?

Depending on how you set it up everyone's data will be out of sync by a greater or lesser extent which may be a deal killer.

Roy Lambert
Fri, Aug 4 2017 11:05 AM

Tim Young [Elevate Software]

Elevate Software, Inc.


Email timyoung@elevatesoft.com

Steve,

<< Anyway, I have one of them running both workstation and server traces to see what's going on to see if there is anything slowing things down.  Maybe it might show me where efficiencies can be made. >>

That's the ticket, for sure.  If you have any issues interpreting the results from either, just email them to me and I'll see if there's any really weird results occurring.

Tim Young
Elevate Software
www.elevatesoft.com