What I was writing was more about avoiding having to deal with 22 queries, each of which will read the same content, filter it (to a different subset) and then perform similar steps: I believe the cost of 22 full-size reads of the same file is going to be rather prohibitive
Not really. Actually, once you get your big CSV into Power Query (I'm skipping a couple of intermediate steps here), you Group that big table by xxx_CAB, yyy_CAB... In your case you end up with 22 nested tables, something like:

    xxx_CAB    Data
    AAA        Table
    BBB        Table
    ...        ...
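For reference, the M behind that grouped query is essentially a single Table.Group call with an "All Rows" aggregation. A minimal sketch, assuming a comma-delimited UTF-8 file at a made-up path and a single grouping column named xxx_CAB:

```
let
    // read the big CSV once
    Source   = Csv.Document(File.Contents("C:\Data\big.csv"), [Delimiter = ",", Encoding = 65001]),
    Promoted = Table.PromoteHeaders(Source, [PromoteAllScalars = true]),
    // one row per key value, with all matching rows kept as a nested table in "Data"
    Grouped  = Table.Group(Promoted, {"xxx_CAB"}, {{"Data", each _, type table}})
in
    Grouped
```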
and load that query as Connection only; this acts as a kind of staging query. Then, with VBA (or manually), you access each nested table and load it as a new query/table on its own sheet (see the VBA sketch below)
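Here is a rough idea of what that VBA could look like. This is only a sketch, with several assumptions on my side: the staging query is named "Staging", the grouping column is xxx_CAB, the key values are hard-coded (in practice you'd read the distinct key list from the staging output), and the generated queries/sheets are named "Grp_*":

```
Sub LoadGroupTables()
    Dim keys As Variant, k As Variant
    Dim ws As Worksheet, qName As String

    ' Hypothetical list of group keys - replace with the real distinct
    ' xxx_CAB values, or read them from the staging query's output
    keys = Array("AAA", "BBB", "CCC")

    For Each k In keys
        qName = "Grp_" & k

        ' Each per-group query just drills into the matching nested
        ' table of the staging query - no re-read of the CSV
        ThisWorkbook.Queries.Add Name:=qName, Formula:= _
            "let Source = Staging, T = Source{[xxx_CAB=""" & k & """]}[Data] in T"

        Set ws = ThisWorkbook.Worksheets.Add
        ws.Name = qName

        ' Same connection string Excel records when you load a
        ' Power Query query to a table on a sheet
        With ws.ListObjects.Add(SourceType:=0, Source:= _
                "OLEDB;Provider=Microsoft.Mashup.OleDb.1;Data Source=$Workbook$;Location=" & qName, _
                Destination:=ws.Range("$A$1")).QueryTable
            .CommandType = xlCmdSql
            .CommandText = Array("SELECT * FROM [" & qName & "]")
            .Refresh BackgroundQuery:=False
        End With
    Next k
End Sub
```

The point of the drill-down formula (Source{[xxx_CAB="AAA"]}[Data]) is that the expensive CSV read and grouping happen once in the staging query; each of the 22 queries only picks up its nested table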
Hope this clarifies things a bit. Now, if this is a one-time exercise and you already know how to do it with VBA, that's probably the best approach: setting up what I describe above and then writing VBA to iterate over each nested table (or doing it manually) will necessarily take more time than a straight VBA script
Any questions or need for clarification, let me know