DBG Very Slow w/ Large Datasets in Memory
Guru master
You can speed things up by disabling certain panels. That way, they don't need to refresh the values each time execution stops.
Right-click any of these panels and choose Disable to possibly improve performance:
Watch
Locals
With Locals disabled, remember that you can then use Watch to view just the specific variables you are interested in; Watch will even let you view specific array elements, etc. Whilst it can be slow, bear in mind that I think the NuSphere DBG debugger is actually better in this respect than some other debuggers, because it uses the DBG Listener with a binary data transfer. Some (most? all?) other PHP debuggers use less efficient mechanisms.
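For instance, with a large structure in memory, a few targeted Watch expressions keep the per-step transfer small. A minimal sketch, assuming a made-up variable $rows (not taken from the thread):

Code:
<?php
// Hypothetical script holding a large in-memory dataset.
$rows = [];
for ($i = 0; $i < 20000; $i++) {
    $rows[] = ['id' => $i, 'payload' => str_repeat('x', 512)];
}

// With Locals disabled, add only the pieces you care about to Watch, e.g.:
//   count($rows)
//   $rows[0]
//   $rows[19999]['id']
// so the debugger only evaluates and transfers those values on each step.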
I'll give that a try and see if disabling the Globals panel will take care of it.
Is it indeed transferring everything in memory on every line? Wouldn't it be more efficient to just send what's changed?
Guru master
I don't know how much it transfers. However, Dmitri has done things with PHP debugging that others have not (such as setting the execution point), so there is probably a good reason for it.
One consideration is that PhpED allows you to inspect variables at any point in the call stack. Maybe it is too complex at the PHP server end to work out which variables have changed, and easier to dump them all if you are showing Locals / Globals so that PhpED can do the comparisons. Then, if you'd like to choose which variables are transferred, you can use Watch. (I'm just guessing.) To be fair, I've only rarely had projects where I've seen slowness; most sites give me a fairly instantaneous debug response.
Yeah, we are definitely not the norm, I'm sure; we have daemon processes that hold many megabytes of data at different points, not just simple web pages.
Site Admin
No, it wouldn't. Finding what's changed would require keeping an original copy of _all_ the values (which costs resources and time for the copying) plus an extra loop through all the values to compare them. It wouldn't work any faster at that expense.
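A rough sketch of the overhead being described, purely illustrative and not how DBG is actually implemented (the function names are made up):

Code:
<?php
// A naive "send only what changed" scheme: it still needs a deep snapshot of
// the previous state plus another full pass over every value to compare.
function snapshot(array $vars): array
{
    return unserialize(serialize($vars)); // deep copy: extra memory and CPU
}

function changed(array $previous, array $current): array
{
    $diff = [];
    foreach ($current as $name => $value) { // extra loop over all the values
        if (!array_key_exists($name, $previous) || $previous[$name] !== $value) {
            $diff[$name] = $value;
        }
    }
    return $diff;
}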
_________________ The PHP IDE team
Hey Dmitri... I think in some cases it may. The situations where I am running into these problems are when I have several megabytes in memory, and each 'step' takes 10-15 seconds to get to the next line of code.
It does seem to be somewhat better with the Globals panel off, but I'm not always debugging memory-intensive processes. As for the original-copy point, it already turns values red when they change; I take it this comparison is done on the GUI side? Perhaps it could be done from the debugger...
Site Admin
Comparison in the GUI is done only against the VISIBLE values. In your case that may be a very small subset of the data.
As for a server-side shadow copy of all values, maintaining it would require resources - memory and CPU to make the shadow copy itself, then to compare it with the current data. It won't work faster unless your network is really slow. BTW, in most cases such problems with huge data happen when people try to fetch all or most of the data from a table (and then pick just one row, for example). They should try to optimize the SQL to return only the necessary data. This will relax the PHP memory manager, the SQL server, and the PHP debugger.
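For example (the table, column, and connection details below are hypothetical), pushing the filter into the SQL keeps the data out of PHP memory in the first place:

Code:
<?php
// Hypothetical example: fetch only the rows you actually need.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

// Wasteful: loads every row into PHP memory, then picks one.
// $all = $pdo->query('SELECT * FROM orders')->fetchAll();

// Better: let the SQL server do the filtering.
$stmt = $pdo->prepare('SELECT id, status, total FROM orders WHERE id = ?');
$stmt->execute([12345]);
$order = $stmt->fetch(PDO::FETCH_ASSOC);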
_________________ The PHP IDE team
In this particular case I have a process that fetches 20k rows to be processed (out of millions), so fetching a smaller subset of the data would slow down overall execution for processing these entries.
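If keeping less in memory per debugger step were still desirable, one option (a sketch only, not something suggested in the thread; the table and column names are made up) is to fetch and process the 20k rows in batches:

Code:
<?php
// Hypothetical sketch: process rows in fixed-size batches so only one batch
// is held in memory (and inspected by the debugger) at a time.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$batchSize = 1000;
$lastId = 0;

while (true) {
    $stmt = $pdo->prepare('SELECT id, payload FROM queue WHERE id > ? ORDER BY id LIMIT ' . $batchSize);
    $stmt->execute([$lastId]);
    $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
    if (!$rows) {
        break; // no more rows to process
    }
    foreach ($rows as $row) {
        // process_row($row); // hypothetical processing step
        $lastId = $row['id'];
    }
}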