NuSphere Forums
DBG Very Slow w/ Large Datasets in Memory


cpriest (Joined: 25 Jul 2006 · Posts: 70) wrote:
Debugging remotely executing code can be excruciatingly slow when there is a large amount of data held in memory.

I can see how it would take some time to transfer this data to the debug client the first time, but it acts as if it's transferring all of this data after every code step.

Is that right? Is there something I can do to speed this up? It can literally take 4 to 10 seconds to step into each line of code...
plugnplay, Guru master (Joined: 24 Jul 2009 · Posts: 716) wrote:
You can speed things up by disabling certain panels. That way, they don't need to refresh the values each time execution stops.

Right-click any of these panels and choose Disable to possibly improve performance:

    Globals
    Watch
    Locals

With Locals disabled, remember that you can still use Watch to view the specific variables you are interested in; Watch will even let you view specific array elements, etc.

While it can be slow, bear in mind that I think the NuSphere DBG debugger is actually better in this respect than some other debuggers, because it uses the DBG Listener with binary data transfer. Some (most? all?) other PHP debuggers use less efficient mechanisms.


cpriest wrote:
I'll give that a try and see if disabling the Globals panel takes care of it.

Is it really transferring everything in memory on every line? Wouldn't it be more efficient to just send what's changed?
plugnplay wrote:
I don't know how much it transfers. However, Dmitri has done things with PHP debugging that others have not (such as setting the execution point), so there is probably a good reason for it.

One consideration is that PhpED allows you to inspect variables at any point in the call stack. Maybe it is too complex at the PHP server end to work out which variables have changed, and easier to dump them all when you are showing Locals / Globals so that PhpED can do the comparisons. If you'd rather choose which variables are transferred, you can use Watch. (I'm just guessing.)

To be fair, I've only rarely had projects where I've seen slowness. Most sites give me a fairly instantaneous debug response.


cpriest wrote:
Yeah, we are definitely not doing the norm, I'm sure; we have daemon processes that load many megabytes of data at various points, not just simple web pages.
dmitri, Site Admin (Joined: 13 Jul 2003 · Posts: 7875) wrote:
Quote:
Wouldn't it be more efficient to just send what's changed?

No, it wouldn't.
Finding what's changed would require keeping an original copy of _all_ the values (costing resources and time for the copying), plus an extra loop through all the values to compare them. At that expense it wouldn't work any faster.
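For illustration, here is a rough sketch of the point above. This is purely hypothetical (it is not DBG's actual code, and the function names are made up): a "send only what changed" scheme still has to keep a deep shadow copy of every value and walk all of them on each step, which is exactly the resource cost being described.

```php
<?php
// Hypothetical sketch only -- not DBG's actual implementation. It shows
// why "send only what changed" still pays for a deep copy of all values
// plus a full comparison loop on every step.

function snapshot(array $vars): array {
    // Deep copy of every value: extra memory and CPU on each step.
    return unserialize(serialize($vars));
}

function diffChanged(array $shadow, array $current): array {
    $changed = [];
    // Still has to walk *all* variables to find the changed ones.
    foreach ($current as $name => $value) {
        if (!array_key_exists($name, $shadow) || $shadow[$name] !== $value) {
            $changed[$name] = $value;
        }
    }
    return $changed;
}

$shadow  = snapshot(['rows' => range(1, 1000), 'i' => 0]);
$current = ['rows' => range(1, 1000), 'i' => 1];

// Only 'i' differs, but we paid for copying and comparing everything.
var_dump(diffChanged($shadow, $current));
```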

_________________
The PHP IDE team


cpriest wrote:
Hey Dmitri... I think in some cases it may. The situations where I run into these problems are when I have several megabytes in memory, and each 'step' takes 10-15 seconds to get to the next line of code.

It does seem to be somewhat better with the Globals panel off, but I'm not always debugging memory-intensive processes.

As for the original-copy point: the IDE already turns values red when they change, so I take it this comparison is done on the GUI side? Perhaps it could be done from the debugger...
Site Admin

dmitri wrote:
Comparison in the GUI is done only against VISIBLE values. In your case that may be a very small subset of the data.

As for a server-side shadow copy of all values, maintaining one would require resources: memory and CPU to make the shadow copy itself, then to compare it with the current data. It won't work faster unless your network is really slow.

BTW, in most cases such problems with huge data happen when people try to fetch all (or most) of the data from a table, then pick out just one row, for example. They should try to optimize the SQL to return only the necessary data. This will relax the PHP memory manager, the SQL server, and the PHP debugger :)
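To illustrate that suggestion, here is a small hedged example. The `entries` table and its columns are invented, and an in-memory SQLite database stands in for a real SQL server; the point is that pushing the filter and limit into the SQL means only the needed rows ever reach PHP memory, so the debugger has far less to transfer.

```php
<?php
// Hypothetical example: table/column names are made up, and an in-memory
// SQLite database stands in for a real SQL server.
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE entries (id INTEGER PRIMARY KEY, status TEXT, payload TEXT)');

$insert = $pdo->prepare('INSERT INTO entries (status, payload) VALUES (?, ?)');
foreach (range(1, 100) as $i) {
    $insert->execute([$i % 10 === 0 ? 'pending' : 'done', "row $i"]);
}

// Wasteful: fetch everything, filter in PHP -- all 100 rows land in memory.
// $all = $pdo->query('SELECT * FROM entries')->fetchAll();

// Better: let the SQL server filter and limit, so PHP (and the debugger)
// only ever sees the rows it actually needs.
$stmt = $pdo->prepare(
    'SELECT id, payload FROM entries WHERE status = :status LIMIT 5'
);
$stmt->execute([':status' => 'pending']);
$rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
echo count($rows), " rows fetched\n";   // 5, not 100
```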



cpriest wrote:
In this particular case I have a process that fetches 20k rows to be processed (out of millions), so fetching a smaller subset of the data would slow down the overall processing of these entries.