ChatGPT AI Made Me A $100,000 TRADING STRATEGY
https://www.youtube.com/watch?v=Jh5rJskkEkU&ab_channel=HumbledTrader
Moving-average trading on a Nasdaq stock, yes, the grail )
And such a flashy round sum, too )
Why develop special "thinking mechanisms" if a statistical model based on textual data reproduces "thinking" mathematically?
Everyone knows that every sentence and every word of a text carries human thought, but not everyone knows that the statistical approach extracts and "weighs" chains of text representing thoughts (people's knowledge and attitudes) in order to "copy" their interrelations and generate new chains (new thoughts) according to mathematically verified patterns.
Any thought represented in text can be transformed into a chain of tokens. "Weighing" the chain against the textual material links it to its context group (other chains, or thoughts in human terms). That is, each chain has its own context space consisting of links to other chains, and the common context space of all chains is the space of AI "thinking".
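The "weighing" of token chains described above can be sketched in miniature. This is only an illustrative assumption, not how a real large language model works: a bigram Markov chain stands in for the statistical model, follower counts stand in for the "weights", and each token's follower table stands in for its "context space". The corpus, function names, and sampling scheme here are all hypothetical.

```python
import random
from collections import defaultdict

def build_context_space(corpus):
    """Count how often each token follows another: the "weighed" links."""
    context = defaultdict(lambda: defaultdict(int))
    for sentence in corpus:
        tokens = sentence.lower().split()
        for a, b in zip(tokens, tokens[1:]):
            context[a][b] += 1  # strengthen the link a -> b
    return context

def generate_chain(context, start, length=6, seed=0):
    """Produce a new token chain by sampling the weighted links."""
    rng = random.Random(seed)
    chain = [start]
    for _ in range(length - 1):
        followers = context.get(chain[-1])
        if not followers:
            break  # no known continuation for this token
        tokens, weights = zip(*followers.items())
        chain.append(rng.choices(tokens, weights=weights)[0])
    return " ".join(chain)

# Tiny illustrative corpus (assumed, not from any real dataset)
corpus = [
    "the model copies human thoughts from text",
    "the model weighs chains of tokens",
    "chains of tokens represent human thoughts",
]
space = build_context_space(corpus)
print(generate_chain(space, "the"))
```

The generated chain recombines fragments of the corpus according to the learned weights, which is the whole point of the argument: no "thinking mechanism" is present, only statistics over text.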
Why, in that case, develop the technology of "real" knowledge, feelings, and relationships? Why not just "copy" the products of human thinking from text and claim that the AI is "reasonable"?
And what is "reasonableness" in this context, if not a well-built, well-tuned statistical model that determines the best chances of survival and prosperity in the surrounding world?
Why develop special "thinking mechanisms" when a statistical model based on textual data reproduces "thinking" mathematically?
...
He's still confused :)
Told him he was wrong ) He corrected himself ))
So far, short episodes of a "thinking" process are reproduced as part of the response to input prompts, although it is clear that no "thinking" is involved. At least not in the human sense.
How would you take the news that there is a chemical soup inside your head that has imagined that it thinks? ))
He's also smart ))
But he got better again ))
Ask it for the same code in MQL6 ;-)