Try to follow me here. This will be far from scientifically accurate, but I think it contains a hint of logic.
I have a 200-watt heater, and it seems to be the commonly recommended size. I have no issues keeping temp even with the house heat turned way down.
It takes approx. 8.4 BTUs to heat a gallon of water 1 degree (F) (a gallon weighs about 8.34 lb, and 1 BTU raises 1 lb of water 1°F). I don't think it matters how long it takes either. Of course factors like the surrounding temp matter, but for argument's sake, let's say the water being heated is at room temp.
You have a 5.7gal:
25 watts = 85 BTU/hr
85 / 8.4 = 10.11 gal
I have a 55gal:
200 watts = 682 BTU/hr
682 / 8.4 = 81.19 gal
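The arithmetic above can be sketched in a few lines. This is a minimal back-of-the-envelope calculation, assuming the standard conversion of 1 watt ≈ 3.412 BTU/hr and the ~8.4 BTU-per-gallon figure from above; results land within a hundredth or two of the numbers in the post, which used rounded BTU values (85 and 682).

```python
# Rough heater-sizing arithmetic: how many gallons a heater can
# raise 1 degree F in one hour, ignoring heat loss to the room.
WATTS_TO_BTU_PER_HR = 3.412   # standard watt -> BTU/hr conversion
BTU_PER_GAL_PER_DEGF = 8.4    # ~8.34 lb per gallon, 1 BTU per lb per degF

def gallons_per_degF_per_hour(watts):
    """Gallons of water the heater can raise 1 deg F in an hour."""
    return watts * WATTS_TO_BTU_PER_HR / BTU_PER_GAL_PER_DEGF

print(round(gallons_per_degF_per_hour(25), 2))   # 25W heater on the 5.7gal
print(round(gallons_per_degF_per_hour(200), 2))  # 200W heater on the 55gal
```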
This is far from exact science.

....
While you have a bigger buffer as far as percentage goes, I have roughly 22 more gallons of buffer (26.2 vs. 4.4 gallons of spare heating capacity).
Now the science gets even more exact lol. While my 55gal has more total exposed glass (i.e., more heat-loss potential), I would think a 5.7gal has a larger percentage of its water exposed to glass, and therefore greater heat loss per gallon. Does this make any sense lol? I'm totally theorizing here...
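The surface-to-volume theorizing above can be sanity-checked with a quick sketch. The tank dimensions here are assumptions, not measurements: a common 55gal footprint of 48x13x21 inches, and a standard 5.5gal footprint of 16x8x10 inches as a stand-in for the 5.7gal.

```python
# Rough glass area per gallon: four side panels plus the bottom,
# divided by tank capacity. Dimensions are assumed typical, not measured.
def glass_area_per_gallon(length, width, height, gallons):
    """Square inches of glass (4 sides + bottom) per gallon of capacity."""
    sides = 2 * (length * height) + 2 * (width * height)
    bottom = length * width
    return (sides + bottom) / gallons

big = glass_area_per_gallon(48, 13, 21, 55)     # assumed 55gal footprint
small = glass_area_per_gallon(16, 8, 10, 5.5)   # assumed 5.5gal footprint
print(round(big, 1), round(small, 1))
```

Under those assumed dimensions the small tank has close to twice the glass per gallon of the 55gal, which would support the greater-heat-loss-per-gallon theory.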
So, I'm thinking you are on the borderline?
I need a cup of coffee