Greetings, everyone! I'd like to invite the experts here to discuss this topic.
One of my colleagues mentioned that he used tags on his site and that, because of them, the site was hit by a filter: the tag pages led to duplicated text. So the main question is: is using tags really so harmful to a site in terms of its promotion, and what is the best approach: not using tags at all, or disallowing them in the robots.txt file? Please share your opinions!
Is this a "white hat" or a "black hat" question?
In this case it is a white hat project. Topics: health, food, sport. Something like a news portal.
For white hat: I recommend closing tag pages from indexing, making links to them nofollow, and using tags only for more precise grouping of content.
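To make that recommendation concrete, here is a minimal sketch of two common ways to keep tag pages out of the index. The `/tag/` path is an assumption for illustration; adjust it to wherever your CMS actually publishes tag archives.

```
# robots.txt — block crawling of tag archives (assumed /tag/ path)
User-agent: *
Disallow: /tag/
```

```html
<!-- Alternative: on each tag page, allow crawling but prevent indexing.
     "follow" lets link weight still flow through the page's links. -->
<meta name="robots" content="noindex, follow">
```

One caveat: a robots.txt `Disallow` only blocks crawling; a blocked URL can still end up indexed if external links point to it. For removing tag pages from the index, the `noindex` meta tag is usually the more reliable option (and for it to work, the page must not be blocked in robots.txt, since the bot has to crawl the page to see the tag).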
For black hat: tags are the basis for content generation and uniquification (text under one tag + a paragraph tagged the same way + a Twitter feed for that tag).
Previously, tags led to duplication, link weight leaked to nowhere, and there was a high chance of getting penalized for content duplication. Nowadays search engine bots can handle them easily, but from my point of view they are not needed.