Make sure all the important pages are indexed and that those you do not want indexed are not. You can check the number of indexed pages in Google Search Console, and you can also use the site: search operator (site:yoursite.com) in Google to see which pages are indexed.
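For example, a quick check in Google search might look like this (the domain and path are illustrative); the first query lists indexed pages across the whole domain, while the second narrows the check to a single section:

```
site:yoursite.com
site:yoursite.com/blog
```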
Check for duplicate or irrelevant indexed pages and work on a plan to have those pages de-indexed. The best way is to return a 410 (Gone) status code for those unwanted pages until they are de-indexed.
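How you return a 410 depends on your server. As a minimal sketch, assuming an nginx server and a hypothetical /old-campaign/ path:

```nginx
# Return 410 (Gone) for a retired section; Google treats this as a
# stronger removal signal than a 404. The path is hypothetical.
location /old-campaign/ {
    return 410;
}
```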
Check your redirects and make sure you are using the correct status codes. Also limit the number of redirects: if a large majority of the pages Google crawls are redirects, or worse, chains of redirects, your site will be negatively impacted.
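Again as a hedged nginx sketch (the URLs are hypothetical), a clean permanent redirect points straight to the final destination rather than hopping through intermediate URLs:

```nginx
# 301 (permanent) redirect straight to the final URL,
# avoiding a chain such as /old-page -> /interim -> /new-page
location = /old-page {
    return 301 /new-page;
}
```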
While you are auditing your indexation, check your meta tags. Are any descriptions or titles missing? Are they unique, and do they all target specific keywords that make sense for your overall strategy? It is not recommended to change your URL structure, but make sure it is readable, with as few special characters and numbers as possible. Preferably your URLs should be descriptive and as unique as possible.
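For reference, a unique, descriptive title and meta description might look like this (the page and wording are purely illustrative):

```html
<title>Handmade Blue Widgets | ExampleShop</title>
<meta name="description" content="Browse our range of handmade blue widgets, with free shipping on orders over $50.">
```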
Crawling is directly related to indexing. To optimize indexing, you can guide Google on how to best crawl your site.
An XML sitemap guides Google on how to crawl your site. Although Google says there is no guarantee that Googlebot will follow your instructions, it is still highly recommended, and in most cases sitemaps are beneficial. A sitemap contains useful information about each page on your site, including when a page was last modified, what priority it has on your site, and how frequently it is updated.
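A minimal sitemap entry, following the sitemaps.org protocol (the URL and values are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://yoursite.com/important-page/</loc>
    <lastmod>2020-06-01</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```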
The robots.txt file sets instructions, per user agent, for which parts of the site can be accessed. Making sure you are not excluding the relevant search engine bots is therefore paramount.
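A short robots.txt sketch (the disallowed paths and sitemap URL are hypothetical); it blocks crawling of private sections without locking out any search engine bot:

```
User-agent: *
Disallow: /admin/
Disallow: /internal-search/

Sitemap: https://yoursite.com/sitemap.xml
```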
You can also guide Googlebot in the code with robots meta tags on each page, the most common directives being follow/nofollow and index/noindex.
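For example, to keep a page out of the index while still letting Googlebot follow its links:

```html
<!-- Placed in the <head> of the page -->
<meta name="robots" content="noindex, follow">
```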
You can submit URLs directly to the Google index in Google Search Console. This is particularly useful if you have crawling issues and there are some pages you want crawled and indexed as a priority.
An important factor linking crawling and indexation is that, for sites with many pages (indexed or not), there is a limit to how much of your site will be crawled each time Googlebot visits. It is therefore important to keep an eye on which pages are indexed and understand why they need to be indexed; not all pages need to be indexed, in particular in cases of duplicate content.
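One common way to consolidate duplicate content, worth adding here as an aside, is the rel=canonical tag, which points duplicate variants at the preferred URL so crawling and indexation concentrate there (the URL below is illustrative):

```html
<!-- In the <head> of each duplicate variant, point to the preferred URL -->
<link rel="canonical" href="https://yoursite.com/product/blue-widget/">
```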