Discussing the article: "Integrate Your Own LLM into EA (Part 2): Example of Environment Deployment"


Check out the new article: Integrate Your Own LLM into EA (Part 2): Example of Environment Deployment.

With the rapid development of artificial intelligence, large language models (LLMs) have become one of its most important components, so it is worth thinking about how to integrate powerful LLMs into our algorithmic trading. For most people, it is difficult to fine-tune these powerful models for their own needs, deploy them locally, and then apply them to algorithmic trading. This series of articles takes a step-by-step approach to achieving that goal.

WSL2 is a major upgrade to the original version of WSL, which Microsoft launched back in 2017. WSL2 is not just a version bump: it is faster, more versatile, and uses a real Linux kernel. Even today, I believe many people are unaware that WSL exists, including some IT practitioners. They still rely on Windows plus a virtual machine, or on a dual-boot setup, switching away from Windows whenever they need Linux.

Of course, it cannot be denied that some tasks may require a complete Linux environment that WSL cannot replace, so those approaches are not without reason.

For artificial intelligence work, we mainly need Linux's powerful command-line tools and GPU-accelerated computing, so a dual-boot setup or Windows plus a virtual machine seems a bit bloated. WSL2 ships with a complete Linux kernel, which brings it much closer to a real Linux environment, and it integrates seamlessly with the Windows file system: you can manage its files like any folder under Windows, and launch a Linux command line from Windows in less than a second. So if you want to use Linux tools efficiently and conveniently from a Windows environment, you will not regret choosing WSL2.
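For readers who want to try the setup described above, a minimal sketch of the steps from an elevated Windows terminal looks roughly like this (these are the documented `wsl` CLI commands; the default distribution and available flags depend on your Windows version, so treat this as a starting point rather than the article's exact procedure):

```shell
# Install WSL with the default Ubuntu distribution
# (Windows 10 21H2 or later, or Windows 11)
wsl --install

# Ensure new distributions use WSL2, the version with the real Linux kernel
wsl --set-default-version 2

# List installed distributions and which WSL version each one runs under
wsl --list --verbose

# Run a single Linux command directly from the Windows command line
wsl ls -la

# Inside WSL2, verify the GPU is visible for accelerated computing
# (requires a recent NVIDIA driver installed on the Windows side)
nvidia-smi
```

The file-system integration works in both directions: from Windows you can browse the Linux file system under the `\\wsl$` network path, and from inside WSL2 your Windows drives appear under `/mnt` (for example, `C:` as `/mnt/c`).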

Author: Yuqiang Pan

 
Very good articles regarding LLMs. Is there a Part 3?
 
williamwong #:
Very good articles regarding LLMs. Is there a Part 3?

Thank you very much for your recognition. Yes, it will come soon!

 
Will you consider covering LLM pre-training in your examples? Thanks for the article.