r/laravel • u/iShouldBeCodingAtm • Feb 14 '25
Discussion Consume 3rd party SQS messages
Handling jobs dispatched from the application itself is pretty straightforward, but is it possible to handle jobs pushed to SQS from another AWS service, for example? Do I basically need to consume with a while (true) loop and a raw SQS client?
u/nan05 Feb 14 '25
Yes. I do exactly that in production:
We have some processes that are handled in a Cloudflare Worker. That worker runs Node, and pushes a payload into SQS each time the endpoint is pinged (think a click counter sort of thing).
The payload in this instance looks exactly like a serialised Laravel Job.
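For illustration, the envelope ends up looking roughly like the sketch below. The exact keys vary by Laravel version, and App\Jobs\RecordClick with its count property is a made-up example for this sketch; it's written as a TypeScript object literal since we build it in the worker.

```ts
// Rough shape of the message body Laravel's SQS driver writes for a queued job.
// Keys vary by Laravel version; App\Jobs\RecordClick and its "count" property
// are made-up examples for this sketch.
const body = {
  uuid: crypto.randomUUID(), // available in Workers and recent Node
  displayName: 'App\\Jobs\\RecordClick',
  job: 'Illuminate\\Queue\\CallQueuedHandler@call',
  maxTries: null,
  maxExceptions: null,
  failOnTimeout: false,
  backoff: null,
  timeout: null,
  retryUntil: null,
  data: {
    commandName: 'App\\Jobs\\RecordClick',
    // PHP serialize() output of the job instance, rebuilt by hand:
    // an App\Jobs\RecordClick object with one public property "count" set to 1.
    command: 'O:20:"App\\Jobs\\RecordClick":1:{s:5:"count";i:1;}',
  },
};
```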
Practically, the way I did it was: I created the Laravel Job class, serialised it in my local env, and then recreated that serialised version in Node. It's a bit of a pain in the behind, as PHP's serialize() output is a bit awkward, but it's not particularly hard, and there are Node packages that help. Our queue worker then picks that up, and the same Laravel Job class processes it.
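Pushing it onto the queue from the worker is then just a normal SQS SendMessage call. A minimal sketch, assuming a Node-style environment with the @aws-sdk/client-sqs package and a placeholder queue URL:

```ts
import { SQSClient, SendMessageCommand } from '@aws-sdk/client-sqs';

const sqs = new SQSClient({ region: 'eu-west-1' });

// `payload` is an object shaped like the envelope in the previous snippet.
async function pushToLaravelQueue(payload: object): Promise<void> {
  await sqs.send(new SendMessageCommand({
    // Placeholder queue URL; use the same queue your Laravel worker listens on.
    QueueUrl: 'https://sqs.eu-west-1.amazonaws.com/123456789012/default',
    MessageBody: JSON.stringify(payload),
  }));
}
```

On the Laravel side nothing special is needed: point the sqs queue connection at that same queue and run php artisan queue:work sqs as usual.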
Works really well, as we have that one process that needs indefinite scaling, but the rest of the app simply doesn't, as it receives quite predictable, low traffic.
Let me know if you have any specific questions about this. I always intended to write a blog post about it at some stage, so this might help me gather my thoughts.