
Author Topic: Intermittent slowness with Web App running on local server  (Read 9070 times)


RickNCN

Intermittent slowness with Web App running on local server
« on: February 25, 2008, 08:17:11 PM »
Small business, using a web app on a local server (SBS 2003 with all patches, etc.). The server was set up by the web app company. The server is a domain controller, and all PCs are on the domain, using domain logons. We have a problem where there is very slow response from the application... sometimes. This happens all the time from one particular XP client, and also when working directly from the server, though maybe to a lesser degree. Typing in boxes in the forms is sometimes excruciatingly slow: as you type, the text doesn't show up right away, but seems to sit in a buffer and then appears seconds later. Pressing buttons doesn't always have an immediate effect either; sometimes you wait seconds before the button press takes effect.
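To put some numbers on the "sometimes," here's a rough Python sketch that times repeated requests to the app from an affected PC, so the intermittent stalls show up as outliers. The URL is just a placeholder for the real app address, not the actual one:

[code]
import time
import urllib.request

URL = "http://sbserver/webapp/"  # hypothetical address of the local web app

samples = []
for i in range(50):
    start = time.perf_counter()
    try:
        urllib.request.urlopen(URL, timeout=30).read()
        elapsed = time.perf_counter() - start
    except Exception as exc:
        elapsed = float("inf")
        print(f"request {i}: failed ({exc})")
    samples.append(elapsed)
    time.sleep(1)  # spread samples out to catch the "sometimes" behaviour

finite = sorted(s for s in samples if s != float("inf"))
if finite:
    print(f"min {finite[0]:.3f}s  median {finite[len(finite)//2]:.3f}s  max {finite[-1]:.3f}s")
[/code]

Running the same script from a "good" PC and from PC2 side by side should show whether the lag is in the request/response path at all, or only in the browser's rendering.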

After speaking with the Web App company, we decided it wasn't a problem with the app or the web service (IIS) config, but seemed more likely to be a networking problem. All physical network problems have been ruled out, so it's not a Media Layer problem but a Host Layer problem somewhere.
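One way to narrow down a Host Layer fault is to time name resolution and the TCP connect separately from the slow PC and from a healthy one; on a domain like this, a client pointed at the wrong DNS server is a classic cause of intermittent stalls. A minimal sketch, assuming a placeholder hostname for the SBS box:

[code]
import socket
import time

HOST = "sbserver.example.local"  # hypothetical FQDN of the SBS server
PORT = 80

t0 = time.perf_counter()
addr = socket.gethostbyname(HOST)   # DNS lookup only
t1 = time.perf_counter()

sock = socket.create_connection((addr, PORT), timeout=10)  # TCP connect only
t2 = time.perf_counter()
sock.close()

print(f"DNS lookup:  {(t1 - t0) * 1000:.1f} ms  ->  {addr}")
print(f"TCP connect: {(t2 - t1) * 1000:.1f} ms")
[/code]

If the DNS number is the one that blows up on the affected PC, the problem is in name resolution, not in the app or the wire.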

After testing, I've found that the problem follows the PC and not the user. In other words, regardless of which user account logs in, we get the slowness when using the web app directly on the server and on PC2.

WHAT the heck is happening here?

One other bit of info that may help pinpoint the problem: when the programmer from the web app company (I'll call him tech support) remotes into the server using Remote Desktop / Terminal Services, the slowness doesn't happen, but when he remoted in using VNC today, it DID happen for him. I can't explain exactly how those two methods differ, but I know they do. You probably know more about it than I do, but I believe Remote Desktop creates a separate new virtual desktop session, because when he's RD'ed in I don't see his actions and he doesn't see mine on the server, whereas VNC is more of a video hook that simply displays, and gives control of, what's directly on screen for the logged-on user. So, whatever you call that, the problem shows up for one method and not the other.
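That intuition matches how Server 2003 works: an RDP login gets its own Terminal Services session, while VNC mirrors the console session (session 0). You can see this on the server itself with the built-in qwinsta / "query session" command; here's a trivial Python wrapper around it, assuming a Python interpreter is available on the box:

[code]
import subprocess

# "qwinsta" (a.k.a. "query session") ships with Windows Server 2003.
output = subprocess.run(["qwinsta"], capture_output=True, text=True).stdout
print(output)
# Typical output shows "console" as session 0 and each RDP user in its
# own "rdp-tcp#..." session with a different session ID.
[/code]

So the fact that VNC (console session) sees the lag while RDP (separate session) doesn't suggests the problem lives in whatever is attached to the console session, not in the server as a whole.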

I'll check CPU load, but I'm almost certain it's not that, for another reason: if it were a CPU load problem on the server, it should affect all the XP clients at the same time, and it doesn't. Only the one PC and the server are affected.
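For completeness, a quick sketch to log CPU load while someone reproduces the typing lag, so the spikes (or lack of them) line up with the symptom. This assumes the third-party psutil package, which is not part of the standard library:

[code]
import time
import psutil  # third-party: pip install psutil

for _ in range(60):
    # cpu_percent(interval=1) blocks for one second and returns overall load
    load = psutil.cpu_percent(interval=1)
    stamp = time.strftime("%H:%M:%S")
    print(f"{stamp}  CPU {load:5.1f}%")
[/code]

If the lag happens while this stays low, that pretty much rules CPU out, as suspected.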