Efficient indexation is key to staying on good terms with Google's crawlers. Many believe that the more pages indexed the better; that is only true up to a point. If you have many non-valuable, rarely visited, or duplicate pages indexed, for example, Google will likely limit your crawl budget and may drop the pages you value most.
Make sure all the important pages are indexed and that the ones you do not want indexed are not. You can check the number of indexed pages in Google Search Console, and also use the site:yoursite.com search operator to see which pages are indexed.
Check for duplicate or irrelevant indexed pages and work out a plan to have them de-indexed. The most reliable way is to return a 410 status code for those unwanted pages until they are de-indexed.
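As a minimal sketch of the 410 approach, the snippet below decides which status code to serve for an incoming path. The `RETIRED_PATHS` set and the `status_for` helper are hypothetical names for illustration; in practice this logic would live in your web server or framework routing.

```python
# Hypothetical list of pages slated for de-indexing (assumption, not from the article).
RETIRED_PATHS = {"/old-promo", "/duplicate-category"}

def status_for(path: str) -> int:
    """Return 410 Gone for retired pages so crawlers drop them faster than a 404."""
    if path in RETIRED_PATHS:
        return 410
    return 200
```

A 410 tells Google the page is gone permanently, which tends to speed up de-indexing compared to a plain 404.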
Check your redirects and make sure you are using the correct status codes. Also limit the number of redirects: if a large share of the pages Google crawls are redirects, or worse, chains of redirects, your site will be negatively impacted.
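To spot redirect chains during an audit, you can follow each URL through an export of your redirect rules. The sketch below assumes a hypothetical `redirects` map of source path to target path (for example, built from a crawl or your server config); chains longer than two entries mean more than one hop.

```python
def redirect_chain(url, redirects, max_hops=10):
    """Follow a URL through a redirect map and return the chain of hops.

    `redirects` is a hypothetical dict of {source_path: target_path}.
    The max_hops guard keeps redirect loops from running forever.
    """
    chain = [url]
    while url in redirects and len(chain) <= max_hops:
        url = redirects[url]
        chain.append(url)
    return chain
```

For example, with `{"/a": "/b", "/b": "/c"}` the chain for `/a` is three entries long, i.e. two hops, which you would want to collapse into a single redirect from `/a` to `/c`.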
While you are auditing your indexation, check your metas. Are any titles or descriptions missing? Are they unique, and do they all target keywords that fit your overall strategy? Changing URL structure is not recommended, but make sure your URLs are readable, with as few special characters and numbers as possible. Preferably, your URLs should be descriptive and as unique as possible.
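The meta audit above can be partly automated. Below is a minimal sketch using Python's standard-library HTML parser to pull the title and meta description out of a page's HTML; the `MetaAudit` class name is an assumption for illustration. Run it over each page's markup and flag any result where either field comes back empty.

```python
from html.parser import HTMLParser

class MetaAudit(HTMLParser):
    """Collect the <title> text and meta description from an HTML document."""

    def __init__(self):
        super().__init__()
        self.title = None
        self.description = None
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("name") == "description":
            self.description = attrs.get("content")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        # Accumulate text only while inside the <title> element.
        if self._in_title:
            self.title = (self.title or "") + data
```

Feeding each page's HTML to a fresh `MetaAudit` instance and checking for `None` values gives you a quick list of pages with missing titles or descriptions; collecting the values in a set also reveals duplicates.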