Indexing is the process by which search engines include the pages of a website in their search results. Depending on the content and its uniqueness, as well as on the settings in the robots.txt file, some sites are indexed faster than others.
When a new site is created and gradually filled with content, search engines try on their own to crawl and cache its pages and add them to the search results. This can take up to 2-3 weeks. To get a site indexed as soon as possible, add it to the search engine's webmaster panel or to its AddURL form, which almost all search engines provide.
You can submit a site to Google for indexing at https://www.google.com/webmasters/tools/. Yandex has its own Yandex.Webmaster interface at https://webmaster.yandex.ru/. To add sites through a webmaster panel, you need to create an account (email) with Google or Yandex. After you add a site in the webmaster panel and verify ownership, the site is placed in the indexing queue.
You can check how many pages a particular search engine has indexed by entering the "site:" operator (without quotes) into the search box, followed by the site URL immediately after the colon, with no space and without https:// — for example, site:example.com. Such a query returns links to all of the site's pages that have been indexed and added to the catalog of the search engine you are querying.
If the site has not been indexed after a month and three weeks or more, it may contain non-unique ("copy-pasted") content, or the robots.txt file may be configured in a way that prohibits search engines from crawling the site's pages. You can read more about setting up your robots.txt file here:
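As an illustration of the kind of misconfiguration mentioned above, a robots.txt that accidentally blocks all search engine crawlers from the entire site looks like this (example.com and the /private/ directory are placeholder names):

```
# Blocks ALL crawlers from the entire site — a common accidental
# cause of a site never appearing in search results:
User-agent: *
Disallow: /

# By contrast, this allows crawling of everything except one directory:
User-agent: *
Disallow: /private/
```

The file must be placed at the site root (e.g. example.com/robots.txt); the Disallow: / rule under User-agent: * is the directive to check first if your pages are not being indexed.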