There is nothing wrong with having a 100 million row table; if your DB can't handle that, then there is a problem. The problem comes in when you query that 100 million row table with a query that is poorly written, retrieves too much data, or filters on things that aren't optimized. That is where your DBA comes in: have them run the query profiler while you run your query. That will identify what is slow in your query, and then the DBA can optimize that.
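If you want a quick look yourself before involving the DBA, you can ask the engine for timing and I/O numbers directly. This is just a rough sketch assuming SQL Server, and the table/column names are invented:

```sql
-- Show CPU/elapsed time and logical reads for whatever runs next
SET STATISTICS TIME ON;
SET STATISTICS IO ON;

-- Hypothetical query against the 100M-row table
SELECT OrderId, CustomerId, OrderDate
FROM dbo.Orders
WHERE OrderDate >= '2023-01-01';

SET STATISTICS TIME OFF;
SET STATISTICS IO OFF;
```

A huge number of logical reads relative to the rows returned is usually the first hint that the query is scanning far more data than it needs.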
Without seeing the query (or queries) that are slow and understanding the table structure, it is going to be very hard to provide any sort of guidance. As an example, if you're using a WHERE clause on a column that isn't indexed, it will probably be slow. If you're filtering on a column that is indexed but you're wrapping it in a UDF or something, the index is not going to be as useful. If you're joining across tables and the joins end up pulling a lot of rows from the different tables, your query will suffer regardless, unless you filter down to the subset you actually need. Again, without the queries and an understanding of your DB structure it is hard to say what is wrong.
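To make those three points concrete, here is a rough sketch (SQL Server style syntax; the table, column, and index names are all made up):

```sql
-- 1) WHERE on an unindexed column scans all 100M rows.
--    An index on the filtered column lets the engine seek instead.
CREATE INDEX IX_Orders_CustomerId ON dbo.Orders (CustomerId);

-- 2) Wrapping the indexed column in a function defeats the index:
SELECT OrderId FROM dbo.Orders WHERE YEAR(OrderDate) = 2023;      -- scan
SELECT OrderId FROM dbo.Orders
WHERE OrderDate >= '2023-01-01' AND OrderDate < '2024-01-01';     -- seek

-- 3) Filter down before/while joining so the join doesn't drag
--    millions of rows along only to throw most of them away.
SELECT o.OrderId, c.Name
FROM dbo.Orders o
JOIN dbo.Customers c ON c.CustomerId = o.CustomerId
WHERE o.OrderDate >= '2024-01-01';
```

Whether any of this applies to your case depends entirely on the actual queries and indexes, which is why posting them will get you much better answers.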