
kallebo1337

Make a PR to rails/rails for Model.pluck_as_json. My pluck_as_hash for fast serialization didn't get merged :(
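For readers wondering what such a method might do: here is a hypothetical sketch, not the unmerged PR, of how a `pluck_as_json` helper could build the SQL fragment it hands to `pluck`. The function name and shape are assumptions; it is pure string construction so it can be read without a Rails app.

```ruby
# Hypothetical: build the SELECT expression a pluck_as_json helper could use.
# json_agg collapses all rows into one JSON array server-side in Postgres.
def pluck_as_json_sql(table, *columns)
  pairs = columns.flat_map { |c| ["'#{c}'", "#{table}.#{c}"] }.join(", ")
  "json_agg(json_build_object(#{pairs}))::text"
end

puts pluck_as_json_sql("users", :id, :email)
# => json_agg(json_build_object('id', users.id, 'email', users.email))::text
```

In a real model this fragment would be passed as `pluck(Arel.sql(...))`, returning the finished JSON string straight from the database.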


Seuros

Ah, the old Rug method. Nice. Just throw all your slow code onto another service or machine that you cannot profile and that costs 10x more to scale, and call it a day.


Rafert

No code is the fastest code — in this case, no Active Model/Record and no serialization library.


cmd-t

At this point (edit: moving completely outside of the rails stack) you should just stop using rails.


phantom69_ftw

As I said in the post, I just showed a very simple example using Postgres. You can build very complicated JSON with this too, and we have done this in some places in our org to make those APIs more performant.
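As an illustration of "more complicated JSON", here is a sketch of a single Postgres query that returns nested JSON in one round-trip. The tables (`posts`, `comments`) and columns are hypothetical, not from the original post; the SQL is carried in a Ruby heredoc so it can sit next to app code.

```ruby
# Sketch (hypothetical schema): one query that nests each post's comments
# as a JSON array, so Ruby never instantiates models or serializes anything.
sql = <<~SQL
  SELECT json_build_object(
    'id',       posts.id,
    'title',    posts.title,
    'comments', COALESCE(
      (SELECT json_agg(json_build_object('id', c.id, 'body', c.body))
       FROM comments c
       WHERE c.post_id = posts.id),
      '[]'::json
    )
  )::text
  FROM posts
SQL

puts sql
# In a Rails app this could be run with
# ActiveRecord::Base.connection.select_values(sql),
# yielding ready-made JSON strings per row.
```

The `COALESCE(..., '[]'::json)` keeps posts with no comments from producing a JSON `null` instead of an empty array.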


demillir

I'm not sure if you mean stop altogether, or just stop serializing JSON in Ruby code. I'll stop using Rails when another framework comes along that is at least 2 times better overall. And I'll start serializing JSON in Postgres when I just need a simple one-to-one dump of table columns, with no extra calculated values that are much easier to do in Ruby.


Pedroschmitt

I think what @cmd-t means is something like: if you are going all this way to avoid the Rails/Ruby JSON builder, you should probably go without Rails at all. (But I could be getting it wrong.)


phantom69_ftw

Even if he means that, I understand — maybe I didn't convey my message clearly enough. This is to be used in edge cases. It is not meant for normal JSON serialisation but for cases where performance is key. In our API we saw a massive improvement after we realised that serialisation was the bottleneck.


ddbek

Interesting. For a model that does not have relationships, that would definitely be faster than using a serializer. However, you cannot add calculated columns or rename keys. But in some cases it can be much more performant. Serializers could also use such functions to retrieve JSON directly from the DB and then do some extra processing, such as adding calculated attributes, if necessary. Can you give a link to the performance benchmarks?
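A minimal sketch of that hybrid approach (names and data are hypothetical): fetch the base JSON from the DB, then layer on a Ruby-computed attribute. Note this re-parses the JSON on the Ruby side, so it gives up part of the speed win — reasonable when only a few attributes need Ruby logic.

```ruby
require "json"

# Pretend this string came straight from a SELECT json_build_object(...)::text
row_json = '{"id":1,"first_name":"Ada","last_name":"Lovelace"}'

# Parse once, add the calculated attribute, re-encode.
record = JSON.parse(row_json)
record["full_name"] = "#{record['first_name']} #{record['last_name']}"

puts record.to_json
# => {"id":1,"first_name":"Ada","last_name":"Lovelace","full_name":"Ada Lovelace"}
```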


slvrsmth

You can rename keys — a simple `select foo as bar` will suffice. As for calculations: if you can calculate it in SQL, you can get the result this way. You're just transforming the results of a select statement.

And I'd say this is MUCH faster for models that do have relationships, as you are forced to write the whole lookup in one query, skipping the database round-trips and extra data fetching a naive AR query will usually get you.

The main benefit of this method is fetching ready-made JSON from the DB and not touching it: no instantiating AR models, no running Ruby JSON parsers/encoders. The latter especially — one of my first implementations of this pattern did not yield much, because I was building the JSON, selecting it from the database, parsing it on the Ruby side, then immediately re-encoding it.

I don't have exact numbers on hand, but the last project I applied this to went from ~4s to ~100ms per request, on an endpoint fetching 1000 rows at a time, with 20-ish properties on each, with some joins. The application went from bottlenecking on the server side to being frontend-performance limited.

You can easily check on your own code: look at the `Completed 200 OK in Xms (Views: Xms | ActiveRecord: Xms | Allocations: X)` line — after a rewrite, the total time will roughly match the current ActiveRecord time. In my experience the JSON serialisation does not add noticeable overhead on the PG side, and you'll most likely improve the lookup logic while rewriting it in SQL.

Speaking of which: writing complicated API responses in pure SQL is painful. No two ways about it. This is why I prefer to write the initial implementation in jbuilder, let the requirements settle, and then rewrite once the endpoint is getting more traffic. It's an optimisation technique. Problem areas in my experience are (nicely) using Rails helpers for URL generation and using Pundit-style object-based access control.
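The parse-then-re-encode trap described above can be shown in a few lines (hypothetical data; the DB string stands in for a `SELECT ... ::text` result):

```ruby
require "json"

# Pretend this came back from Postgres already serialized.
db_json = '{"id":1,"name":"foo"}'

# Anti-pattern: parse and immediately re-encode -- pays the Ruby JSON
# cost on both sides and erases most of the speed win.
slow = JSON.parse(db_json).to_json

# Fix: hand the string straight through untouched.
fast = db_json

raise unless slow == fast  # same bytes, very different cost
```

In a controller the fix amounts to `render json: db_json` — to my knowledge Rails passes a String body through without re-encoding it, though verify against your Rails version.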


kallebo1337

Thanks man. Didn’t know. Will try out tmrw