Limiter goes wild!
Setting the limiter from 0-35 records (in the Preferences menu). Saved.
Then switched tables, and the limiter goes up to 3500 or 4000.
Hey! stay at 35 :D
When I changed this in the front screen, it stays at 35, but the limit check mark disappears...
And then for some tables it went up to 4000 again...
Something's wrong here.
I usually never work with a default though; I just let Heidi remember the last number I entered. This works fine.
See also my related commit-notes:
http://heidisql.svn.sourceforge.net/viewvc/heidisql?view=rev&revision=399
That does solve my fears... but it also introduces a new problem.
I just tested on a database with a table that contains a field with (a lot of) BLOB data. It now shows only 1 row when I want to browse the data, which is quite annoying, because it seems there's no way around it anymore.
Anyways, I was already thinking of suggesting a feature that allows you to browse a table with BLOB data without that (BLOB) data actually being transferred (which you could switch on before clicking on the Data tab, of course). This would also solve that problem.
(In this particular case I've stored the contents of a PDF file in that table. That data is useless to me while I'm browsing the other data in the table, and I imagine this goes for all instances where the data is not an image or text. So it would be very helpful and really speed things up if it could be ignored.)
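Something like the following is what I have in mind (table and column names are made up here; the point is to replace each BLOB column with a cheap placeholder instead of fetching its contents):

    -- Hypothetical table: documents(id, title, created_at, pdf_data BLOB).
    -- Instead of SELECT * (which transfers every BLOB), show only each BLOB's size:
    SELECT
        id,
        title,
        created_at,
        CONCAT('(BLOB, ', COALESCE(LENGTH(pdf_data), 0), ' bytes)') AS pdf_data
    FROM documents
    LIMIT 50;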
Whatever I did, I kept getting 1 record... but then my eye fell on the filter. Heidi apparently remembered the filter setting from yesterday (even though I've closed and opened Heidi many times since). I didn't know the filter worked that way! Anyways, that's what limited the number of rows all along, and not the 5 MB threshold :)
Anyways, I was quite impressed that Heidi could know beforehand how much 5 MB of data was... I first assumed it would make a calculated guess based on the field types. But since it seemed to work with the table with BLOB data (which can be both really small and really big), I suspected it was probably possible to tell MySQL to limit results to a certain amount of data.
I am now guessing again that my first thought is closer to the truth, because now that I've removed the filter, clicking on the Data tab (for browsing the data) retrieves all 294 records in the table, which make up 56 MB of data in total. So it doesn't limit it to the 5 MB that it should.
The limit is not really 5 MB but 5000 records. I'm noticing that whenever I open a table with more than 5000 records, Heidi automatically checks the limit and sets it to 5000 to prevent all the rows from being loaded. When I then click on a table with fewer rows than 5000, the check mark disappears again.
But this is clearly interfering with what I want as a user. Once I put the check mark there with an explicit mouse click, it should stay there, regardless of what table data I start browsing. And it should also remember the number of rows I'm limiting it to (and preferably between sessions (opening and closing of the program), like mysqlfront 2.5 did). These two things (the check mark and the amount) now get lost and are overridden by Heidi's own behaviour.
HS calculates the row size of your table by fetching the average row size shown by the SHOW TABLE STATUS LIKE 'xyz' statement and extrapolates it to a fitting number of rows.
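Roughly like this (numbers are made up; the exact threshold and rounding are in the source):

    SHOW TABLE STATUS LIKE 'mytable';
    -- Among other columns, this returns for example:
    --   Rows: 100000   Data_length: 3000000   Avg_row_length: 30
    -- The limiter then extrapolates how many rows fit into the threshold:
    --   row limit = 5000000 bytes / 30 bytes per row ≈ 166666 rows, give or take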
The only thing I'm unsure about is whether the checkbox state is reset after you have manually checked or unchecked it. We will check that :)
The limit is by default 5 MB.
HS calculates the row size of your table by fetching the average row size shown by the SHOW TABLE STATUS LIKE 'xyz' statement and extrapolates it to a fitting number of rows.
Hmm... it doesn't seem to work quite like that.
Example:
I have an ip2country table with 65291 rows, a Data_length of 1971544 and an avg_row_length of 30. If Heidi limited it to 5 MB, she should return 5000000 / 30 ≈ 166666 rows (more rows than the table even has). But when I click on Data, it says the results are limited to 5,000 rows.
Another example:
That table with the PDFs in it has 294 rows, a Data_length of 56873984 and an avg_row_length of 193448. So Heidi should limit it to 5000000 / 193448 ≈ 25 rows. But when I click on Data, the results are not limited at all: I have all the rows on my screen (which takes quite long).
The first case I don't mind so much, because 5000 rows is more than enough. But the last one is more of a problem, because 56 MB are being transferred every time I want to browse the data.
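(For reference, both sets of numbers above come straight from SHOW TABLE STATUS, so the expected limits are easy to verify; the second table name is made up here:)

    SHOW TABLE STATUS LIKE 'ip2country';
    -- Rows: 65291   Data_length: 1971544   Avg_row_length: 30
    -- expected limit: 5000000 / 30 ≈ 166666 rows, i.e. no cap needed at all
    SHOW TABLE STATUS LIKE 'pdf_documents';
    -- Rows: 294   Data_length: 56873984   Avg_row_length: 193448
    -- expected limit: 5000000 / 193448 ≈ 25 rows, yet no limit is applied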
The only thing I'm unsure about is whether the checkbox state is reset after you have manually checked or unchecked it.
It works this way intentionally. It was designed so that you can never click a table and have HeidiSQL suddenly hang or crash itself or your PC.
Concretely, it recalculates whether it's necessary to cap the number of rows fetched when you switch tables; if HeidiSQL finds that you'd exceed the threshold (5 MB currently), a limit is applied.
It's hard to imagine that anyone would want it any other way, because that would cause HeidiSQL to hang or crash (see tracker for old issues) whenever you switch to a table with large rows. But then again, users ARE always smarter than developers, so perhaps it should be changed to be less intelligent and just crash HeidiSQL like it did in the not-so-old days, ho hum.
...blah... like mysqlfront 2.5 did ...blah...
Comparisons with MySQL-Front 2.5 are irrelevant, since it used USE_RESULT mode for fetching SQL rows, which made the memory issue largely disappear until you browsed to the last record. USE_RESULT came with its own set of bugs, so that's not being introduced into HeidiSQL.
What could be done:
It would be neat to fetch rows as needed by sending the server a LIMIT x,y statement whenever the scroll bar is dragged. I'm not sure how to integrate that with Zeos, but it would completely remove the LIMIT options from the toolbar and preferences, rendering this whole discussion moot.
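For example, scrolling to a given position in the grid would translate into something like this (table name and window values made up):

    -- User scrolls to around row 10000 in a grid showing 50 rows at a time;
    -- the client fetches only that window instead of the whole table:
    SELECT * FROM mytable LIMIT 10000, 50;  -- skip 10000 rows, return the next 50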
This point has also been mentioned in the issue tracker, so if you had taken the time to check for previous discussion, you would have found it there.
Setting the limiter from 0-35 records (in the Preferences menu). Saved.
Then switched tables, and the limiter goes up to 3500 or 4000.
Hey! stay at 35
Yep, I think that's a bug. There should be a "0 = auto" setting in preferences. If the user really wants a fixed limit, (s)he should be able to set it in prefs, disabling the automatic limiter.
And since the inner working of the automatic limiter is clearly confusing to users: when the preference is set to 0 / automatic, the limit fields should be grayed out and uneditable. The check box itself should still be available, to turn the limiter off after browsing into individual tables.
...blah... it doesn't seem to work quite like that ...blah...
Well, no. It's rounded to the nearest thousand, and an internal overhead per row is taken into consideration, for example. Consult the source code if you're eager to see the details. It's a highly imprecise calculation, but it beats crashing your PC by running it out of memory.
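As a sketch, that kind of estimate could be written as a query against information_schema (the 5 MB threshold and the per-row overhead constant are assumptions for illustration, not the exact values from the source):

    SELECT ROUND(5000000 / GREATEST(AVG_ROW_LENGTH + 100, 1), -3) AS estimated_row_limit
    FROM information_schema.TABLES
    WHERE TABLE_SCHEMA = DATABASE() AND TABLE_NAME = 'mytable';
    -- AVG_ROW_LENGTH + 100 models a fixed per-row overhead (constant assumed);
    -- ROUND(..., -3) rounds the result to the nearest thousand, as described above.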
It's hard to imagine that anyone would want it any other way, because that would cause HeidiSQL to hang or crash (see tracker for old issues) whenever you switch to a table with large rows. But then again, users ARE always smarter than developers, so perhaps it should be changed to be less intelligent and just crash HeidiSQL like it did in the not-so-old days, ho hum.
This option does have its advantages, but only if it remembered when I override it using the check mark. Right now, that is not happening.
If I'm on a crappy connection to an external database, I don't want Heidi to give me 5000 rows worth 5 MB of data when I've already told it I prefer to browse with a limit of 20.
Comparisons with MySQL-Front 2.5 are irrelevant
...blah... like mysqlfront 2.5 did ...blah...
... is that like cursing in church now? :)
I'm sorry for that then. I was only making the comparison to illustrate certain behaviour. OK, without the reference then:
It would be nice if Heidi remembered the last limit that a user sets. So if I set it to 40 and close the program, it's still 40 when I start it again.
...blah... it doesn't seem to work quite like that ...blah...
Well, no. It's rounded to the nearest thousand, and an internal overhead per row is taken into consideration, for example. Consult the source code if you're eager to see the details. It's a highly imprecise calculation, but it beats crashing your PC by running it out of memory.
I'm getting the feeling I somehow offended you? I'm sorry if I did.
I'm just a user who knows SQL and who's reporting what seems like a bug in my eyes... I'm only being thorough in that report because I know how frustrating it can be when someone simply reports: 'it doesn't work as expected'. So I was just elaborating on my findings. Looking at Delphi code would get me nowhere.
Anyways, you seem to have missed the point of Heidi fetching 56 MB of data instead of limiting it to 5 MB. That's clearly a bug in my eyes.
What's worse, if I limit the data myself, Heidi suddenly jumps back to limiting it herself again, ignoring my setting and fetching the 56 MB of data again.