Roadmap

Retrieval of related posts using LLMs (previously listed as “Retrieval of related posts by analyzing article content”)

Although neither approach is available in the plugin yet, internal tests have shown that LLMs deliver better performance and more accurate results than approaches based on analyzing article content. With this new feature, it will be possible to retrieve far more relevant, contextualized related posts based on the displayed content.

For example, two articles within the same category will be able to show different suggestions that are truly meaningful for each reader.

You will be able to integrate external services via API, such as OpenAI, or use a local model through Ollama. In the latter case, you will need to make your local installation reachable from outside, for example by setting up a tunnel with a tool such as Ngrok, Cloudflare Tunnel, LocalTunnel, or Serveo.
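As an illustration of how LLM-based retrieval could rank related posts, here is a minimal sketch that scores candidate posts by cosine similarity between embeddings. The endpoint URL is a placeholder for a tunneled local Ollama install, and all function names are assumptions for illustration, not the plugin's actual API.

```python
import math

# Placeholder URL for a local Ollama install exposed through a tunnel (assumption).
OLLAMA_EMBED_URL = "https://example-tunnel.example.com/api/embeddings"

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def rank_related(current_vec, candidates):
    """Rank candidate posts (title, embedding) by similarity to the current article."""
    scored = [(cosine(current_vec, vec), title) for title, vec in candidates]
    return [title for _, title in sorted(scored, reverse=True)]
```

With real embeddings fetched from the configured service, the most semantically similar posts would sort first, so two articles in the same category can still surface different suggestions.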

This feature is planned for release within the next year.


Offset Option

With the offset option, you can skip a specific number of related posts from the beginning of the displayed list.
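In effect, the option works like a list slice. A minimal sketch (the helper name is hypothetical, not part of the plugin):

```python
def apply_offset(related_posts, offset=0):
    """Skip the first `offset` related posts before display (hypothetical helper)."""
    return related_posts[offset:]
```

For example, with four related posts and an offset of 2, only the third and fourth posts are displayed.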


Using Related Posts from the Internal Linking of Related Contents Plugin

If the Internal Linking of Related Contents plugin is active, related posts will be retrieved according to that plugin's settings. This avoids duplicating the retrieval work, so the site is not overloaded and performance stays optimal.


Automatic Offset

When the option above is enabled, an automatic offset is applied in addition to any manual offset.

If the Internal Linking of Related Contents plugin has already inserted 3 related posts directly into the content of an article, and there are 8 related posts in total, the remaining 5 posts will be displayed by automatically applying an offset of 3.

This ensures the related posts displayed outside the content do not duplicate those already included within it.
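The example above can be sketched as follows; this assumes the posts already inserted in the content are the first N entries of the related list, and the function name is hypothetical:

```python
def display_outside_content(all_related, inserted_in_content):
    """Automatically offset by the number of posts already placed in the article body,
    so the list shown outside the content never repeats them (illustrative sketch)."""
    offset = inserted_in_content  # e.g. 3 posts already inserted in the content
    return all_related[offset:]
```

With 8 related posts in total and 3 already inserted in the content, the remaining 5 are displayed outside it.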