There is one thing you can try with the current solution, and that is to change
DECLARE FileFeed CURSOR FOR
to
DECLARE FileFeed CURSOR STATIC LOCAL FOR
The default cursor type is a dynamic cursor, which means that the query is re-evaluated for every FETCH. With a STATIC cursor, the query is executed once, the result is stored in a hidden temp table, and the cursor is served from that table. It is actually not uncommon to see code with loops where the slowest part is finding the next row.
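Just to illustrate (the query, table and procedure names below are placeholders, not taken from your actual code), the only change is in the DECLARE line; the rest of the loop stays the same:

DECLARE @FileID int;

DECLARE FileFeed CURSOR STATIC LOCAL FOR
   SELECT FileID FROM dbo.FilesToProcess;

OPEN FileFeed;

WHILE 1 = 1
BEGIN
   FETCH FileFeed INTO @FileID;
   IF @@FETCH_STATUS <> 0
      BREAK;

   -- Your existing per-file procedure; placeholder name.
   EXEC dbo.ProcessFile @FileID;
END;

CLOSE FileFeed;
DEALLOCATE FileFeed;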
But more likely it is as Bruce says: the issue is inside the stored procedures themselves. So if making the cursor static does not save the day, you will need to roll up your sleeves and start tuning.
The first thing is to see whether there is any particular statement in your procedures that is slow. When you run things in a loop like this, a scan over a non-indexed temp table with a modest 30000 rows can be a killer, because it happens again and again.
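As an illustration (table and column names here are made up, not from your code), adding an index to such a temp table turns the repeated scans into seeks:

-- Hypothetical temp table with 30000 rows that the procedure reads repeatedly.
CREATE TABLE #FileRows (FileID int NOT NULL, RowData nvarchar(400) NOT NULL);

-- Without an index, every lookup on FileID scans all 30000 rows.
CREATE CLUSTERED INDEX ix_FileID ON #FileRows (FileID);

-- A lookup like this, executed once per iteration, is now an index seek.
DECLARE @FileID int = 42;
SELECT RowData FROM #FileRows WHERE FileID = @FileID;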
On my web site, you find the tool sp_sqltrace, written by Lee Tudor. It's a really great tool for finding bottlenecks in loops.

But in the end, the long-term solution is to rewrite these procedures so that they can handle sets of data. Or maybe rather to write new procedures, and keep the old ones as wrappers on the new ones, if there is code that needs to be able to call these procedures by ID. And, no, this is by no means a simple task. How difficult it is, I don't know, but I have done this with some long stored procedures, and it was certainly not a walk in the park. Not least because you need a completely different mindset for a set-based solution than for a scalar solution.
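To give an idea of what the wrapper pattern could look like (the procedure, table and column names below are all hypothetical), the old by-ID procedure becomes a thin shell that feeds a single ID into the new set-based procedure:

-- New set-based procedure; works on all IDs the caller has put in #FileIDs.
CREATE PROCEDURE dbo.ProcessFiles_SetBased
AS
BEGIN
   -- Expects the caller to have created and populated #FileIDs(FileID).
   UPDATE f
   SET    f.Status = 'Processed'
   FROM   dbo.Files f
   JOIN   #FileIDs i ON i.FileID = f.FileID;
END;
GO

-- The old procedure is kept as a wrapper, so code that works by single ID still runs.
CREATE PROCEDURE dbo.ProcessFile @FileID int
AS
BEGIN
   CREATE TABLE #FileIDs (FileID int NOT NULL PRIMARY KEY);
   INSERT #FileIDs (FileID) VALUES (@FileID);
   EXEC dbo.ProcessFiles_SetBased;
END;
GO

The temp table created in the wrapper is visible inside the called procedure, which is what makes this pattern work.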