Open Neural Network Exchange (ONNX): PyTorch to Tensorflow Demo - Tutorial 11
Machine Learning and Neural Networks
MetaQuotes, 2023.04.07 11:41
Artificial Intelligence: Mankind's Last Invention
The video "Artificial Intelligence: Mankind's Last Invention" explores the advancements and potential risks associated with developing artificial intelligence (AI). The video highlights Google DeepMind's AlphaGo, which surpassed centuries of human strategy knowledge in only 40 days. It dives into the differences between weak and strong AI and discusses how advanced AI can lead to a technological singularity, where it improves upon itself continuously and becomes billions of times smarter than humans. The speaker emphasizes the importance of giving AI human-like values and principles and cautions against creating an uncontrollable system. The video concludes by stressing the need to carefully consider the consequences of developing super intelligent AI before doing so.
MetaQuotes, 2023.04.07 11:42
Artificial Intelligence | Machine Learning | Documentary | Canadian Economy | Robots | Robotics | AI
The documentary discusses the artificial intelligence revolution in Canada and how the nation has become one of the great AI superpowers. It examines the journey of Professor Geoffrey Hinton in modeling the brain and developing the multi-layered deep neural networks that can solve problems that were previously unsolvable. The potential dangers of AI and machine learning are also explored, including the ability to create fake audio from a recording of someone's voice, alongside the use of reinforcement learning and robotics to develop autonomous entities that contribute to society in meaningful ways. Despite the potential for job loss and inequality, some argue that humans will eventually become intelligent machines themselves, continuing to evolve and create a new generation of humanity. The documentary also highlights the unknown nature of AI and the unpredictability of a future in which technology continues to change everything.
MetaQuotes, 2023.04.07 12:02
Caltech's Machine Learning Course - CS 156 by Professor Yaser Abu-Mostafa
Caltech's Machine Learning Course - CS 156. Lecture 01 - The Learning Problem

The first lecture of Yaser Abu-Mostafa's machine learning course introduces the learning problem: finding patterns in data to make predictions without human intervention. He explains the need for mathematical formalization to abstract practical learning problems and introduces the first algorithm of the course, the perceptron model, which uses a weight vector to classify data points into binary categories. The lecture also covers different types of learning, including supervised, unsupervised, and reinforcement learning, and presents a supervised learning problem to the audience to address the issue of determining a target function for learning.

The professor then covers various further topics. He emphasizes the need to avoid bias when selecting data sets, as well as the importance of collecting a sufficient amount of data. He also discusses the role of the hypothesis set in machine learning and the impact of the choice of error function on the optimization technique, and touches on the criteria for including machine learning methods in the course and his focus on providing practical knowledge rather than pure theory.
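As a concrete companion to the perceptron model introduced here, below is a minimal sketch of the perceptron learning algorithm on a toy linearly separable dataset (my own illustration, not the course's code; the dataset and names are made up):

```python
import numpy as np

def pla(X, y, max_iters=1000):
    """Perceptron Learning Algorithm: find w with sign(X @ w) == y.

    X is augmented with a bias coordinate x0 = 1; y holds +/-1 labels.
    """
    X = np.hstack([np.ones((X.shape[0], 1)), X])  # prepend bias term
    w = np.zeros(X.shape[1])                      # initial weight vector
    for _ in range(max_iters):
        preds = np.sign(X @ w)
        misclassified = np.where(preds != y)[0]
        if misclassified.size == 0:               # all points correct: done
            return w
        i = misclassified[0]                      # pick a misclassified point
        w += y[i] * X[i]                          # PLA update: w <- w + y_n * x_n
    return w

# Toy linearly separable data, labeled by the true line x2 = x1
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(50, 2))
y = np.sign(X[:, 1] - X[:, 0])
print("learned weights:", pla(X, y))
```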
MetaQuotes, 2023.04.07 12:04
Caltech's Machine Learning Course - CS 156 by Professor Yaser Abu-Mostafa
Caltech's Machine Learning Course - CS 156. Lecture 02 - Is Learning Feasible?
The lecture discusses the feasibility of learning, specifically the use of machine learning to determine patterns from given data. The lecturer introduces the quantities nu (the frequency observed in a sample) and mu (the underlying probability) and how they relate to the learning problem. Adding this probabilistic view makes learning feasible without compromising the target function, meaning no assumptions need to be made about the function that will be learned. The concept of overfitting and how it relates to model sophistication is discussed, with a larger number of hypotheses leading to poorer generalization. Ultimately, the lecture concludes with a request to review the slide on the implication of nu equals mu.
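The feasibility argument in this lecture rests on the Hoeffding inequality, P[|nu − mu| > eps] ≤ 2·exp(−2·eps²·N). A quick simulation (my own sketch, not from the lecture; the parameter values are arbitrary) shows the bound in action for a biased coin:

```python
import numpy as np

# Hoeffding: P[|nu - mu| > eps] <= 2 * exp(-2 * eps^2 * N)
mu, eps, N, trials = 0.6, 0.1, 100, 100_000

rng = np.random.default_rng(1)
samples = rng.random((trials, N)) < mu      # Bernoulli(mu) draws
nu = samples.mean(axis=1)                   # sample frequency per trial
empirical = np.mean(np.abs(nu - mu) > eps)  # observed violation rate
bound = 2 * np.exp(-2 * eps**2 * N)

print(f"empirical P[|nu-mu| > {eps}] = {empirical:.4f}")
print(f"Hoeffding bound            = {bound:.4f}")
```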
MetaQuotes, 2023.04.07 12:07
Caltech's Machine Learning Course - CS 156 by Professor Yaser Abu-Mostafa
Caltech's Machine Learning Course - CS 156. Lecture 03 - The Linear Model I
This lecture covers linear models in machine learning, input representation, the perceptron algorithm, the pocket algorithm, and linear regression, including its use in classification. The professor emphasizes the importance of trying ideas out on real data and introduces the concept of features to simplify the learning algorithm's life. The lecture also discusses the computational aspects of the pseudo-inverse in linear regression and the problems that can arise when using linear regression for classification on non-separable data. Finally, the concept of using nonlinear transformations to make data more linear is presented, with an example showing how to achieve separable data using the transformed features x1² and x2², the squared coordinates measured from the origin.
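To make the pseudo-inverse step and the nonlinear transform concrete, here is a minimal sketch (my own illustration, not the course's code) of one-shot linear regression used as a classifier, first on raw coordinates and then on the transformed features z = (x1², x2²):

```python
import numpy as np

def lin_reg_weights(X, y):
    """One-shot linear regression: w = pseudo-inverse(X) @ y."""
    Xb = np.hstack([np.ones((X.shape[0], 1)), X])  # add bias column
    return np.linalg.pinv(Xb) @ y                  # w = (X^T X)^-1 X^T y

rng = np.random.default_rng(2)
X = rng.uniform(-1, 1, size=(200, 2))
y = np.sign(X[:, 0]**2 + X[:, 1]**2 - 0.5)  # circular boundary: not linearly separable

# Raw features struggle; z = (x1^2, x2^2) makes the boundary linear in z-space.
for name, feats in [("raw x", X), ("z = (x1^2, x2^2)", X**2)]:
    w = lin_reg_weights(feats, y)
    preds = np.sign(np.hstack([np.ones((feats.shape[0], 1)), feats]) @ w)
    print(f"{name}: training accuracy = {np.mean(preds == y):.2f}")
```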
The professor also covers various further topics related to the linear model. He discusses nonlinear transformations and guidelines for selecting them, in-sample and out-of-sample errors in binary classification, using linear regression for correlation analysis, and deriving meaningful features from the input. He also emphasizes the importance of understanding the distinction between E_in and E_out and how they impact model performance. Lastly, he touches on the relationship between linear regression and maximum likelihood estimation, the use of nonlinear transformations, and the role of theory in understanding machine learning concepts.
MetaQuotes, 2023.04.07 12:09
Caltech's Machine Learning Course - CS 156 by Professor Yaser Abu-Mostafa
Caltech's Machine Learning Course - CS 156. Lecture 04 - Error and Noise
In Lecture 04 of the machine learning course, Professor Abu-Mostafa discusses the importance of error and noise in real-life machine learning problems. He explains the concept of nonlinear transformation using the feature space Z, which is essential in preserving linearity in learning. The lecture also covers the components of the supervised learning diagram, emphasizing the importance of error measures in quantifying the performance of the hypothesis. Noisy targets are introduced as a typical component of real-world learning problems, which must be considered when minimizing the in-sample error. The lecture ends with a discussion on the theory of learning and its relevance in evaluating the in-sample error, out-of-sample error, and model complexity.
The professor explains how changes in the probability distribution can affect the learning algorithm and how error measures can differ for different applications. He also discusses the algorithm for linear regression, the use of squared error versus absolute value for error measures in optimization, and the tradeoff between complexity and performance in machine learning models. The professor clarifies the difference between the input space and feature extraction and notes that the theory for how to simultaneously improve generalization and minimize error will be covered in the coming lectures.
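The squared-versus-absolute-error point has a crisp illustration: the constant prediction that minimizes squared error is the mean of the targets, while the one that minimizes absolute error is the median, so the two measures react very differently to outliers. A toy demonstration of my own (not from the lecture):

```python
import numpy as np

# Targets with one outlier; find the best constant prediction h under
# squared error (minimized by the mean) vs absolute error (minimized
# by the median).
y = np.array([1.0, 1.1, 0.9, 1.0, 10.0])  # 10.0 is an outlier

hs = np.linspace(0, 10, 10_001)
sq_err = [np.mean((y - h)**2) for h in hs]
abs_err = [np.mean(np.abs(y - h)) for h in hs]

print(f"argmin squared error  = {hs[np.argmin(sq_err)]:.2f}  (mean   = {y.mean():.2f})")
print(f"argmin absolute error = {hs[np.argmin(abs_err)]:.2f}  (median = {np.median(y):.2f})")
```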
MetaQuotes, 2023.04.07 12:11
Caltech's Machine Learning Course - CS 156 by Professor Yaser Abu-Mostafa
Caltech's Machine Learning Course - CS 156. Lecture 05 - Training Versus Testing
In Lecture 5 of his course on Learning From Data, Professor Abu-Mostafa discusses the concepts of error and noise in machine learning, the difference between training and testing, and the growth function, which measures the maximum number of dichotomies that can be produced by a hypothesis set for a given number of points. He also introduces the break point, which corresponds to the complexity of a hypothesis set and guarantees a polynomial growth rate in N if it exists, and discusses various examples of hypothesis sets such as positive rays, intervals, and convex sets. The lecture emphasizes the importance of understanding these concepts and their mathematical frameworks in order to fully comprehend the complexity of hypothesis sets and their potential for feasible learning.
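For the positive-ray hypothesis set mentioned above, h(x) = sign(x − a), the growth function can be verified by brute force: on N distinct points exactly N + 1 dichotomies are realizable. A small check of my own (the point placement is arbitrary):

```python
def positive_ray_dichotomies(n):
    """Count dichotomies realizable by h(x) = sign(x - a) on n points.

    A positive ray labels a suffix of the sorted points +1 and the rest -1,
    so enumeration should find exactly n + 1 distinct dichotomies.
    """
    points = [i / n for i in range(n)]                   # n distinct points
    thresholds = [-1.0] + [p + 0.5 / n for p in points]  # one 'a' per gap
    dichotomies = {tuple(1 if x > a else -1 for x in points)
                   for a in thresholds}
    return len(dichotomies)

for n in range(1, 8):
    print(f"N = {n}: m_H(N) = {positive_ray_dichotomies(n)} (expected {n + 1})")
```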
The professor covered various topics related to training versus testing. He addressed questions from the audience about non-binary target and hypotheses functions and the tradeoff of shattering points. The professor explained the importance of finding a growth function and why it is preferred over using 2 to the power of N to measure the probability of generalization being high. Additionally, he discussed the relationship between the break point and the learning situation, noting that the existence of the break point means that learning is feasible, while the value of the break point tells us the resources needed to achieve a certain performance. Finally, the professor explained the alternatives to Hoeffding and why he is sticking to it to ensure people become familiar with it.
MetaQuotes, 2023.04.07 12:13
Caltech's Machine Learning Course - CS 156 by Professor Yaser Abu-Mostafa
Caltech's Machine Learning Course - CS 156. Lecture 06 - Theory of Generalization
The lecture discusses the theory of generalization and the growth function, defined as the maximum number of dichotomies that can be generated by a hypothesis set on a set of N points, with the goal being to characterize the entire growth function for every N by means of the break point. The speaker demonstrates the process of computing the growth function for different hypothesis sets and proves an upper bound on the growth function using a combinatorial identity. The discussion also touches on using the growth function in place of the number of hypotheses in the Hoeffding inequality, characterizing the overlaps between hypotheses, and the resulting Vapnik-Chervonenkis inequality, which is polynomial in N with the order of the polynomial determined by the break point.
The professor discusses the theory of generalization, clarifying previous points and explaining the concept of a break point, which is used to calculate resources needed for learning. The focus of learning is on approximation to E_out, not E_in, allowing the learner to work with familiar quantities. The professor also explains the reasoning behind replacing M with the growth function and how this is related to the combinatorial quantity B of N and k. While discussing regression functions, the professor emphasizes the bias-variance tradeoff and how learnability is independent of the target function. Finally, the professor notes that the same principles apply to all types of functions.
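The combinatorial quantity B(N, k) is the maximum number of dichotomies on N points such that no k of the points are shattered; the lecture's recursion B(N, k) ≤ B(N − 1, k) + B(N − 1, k − 1) leads to the polynomial bound Σ_{i=0}^{k−1} C(N, i). A small script of my own evaluates the recursion (taken with equality) and checks it against that binomial sum:

```python
from math import comb
from functools import lru_cache

@lru_cache(maxsize=None)
def B(N, k):
    """Bound on dichotomies of N points when no k points are shattered.

    Boundary cases: B(N, 1) = 1 and B(1, k) = 2 for k > 1; the body
    applies the recursion B(N, k) = B(N-1, k) + B(N-1, k-1).
    """
    if k == 1:
        return 1
    if N == 1:
        return 2
    return B(N - 1, k) + B(N - 1, k - 1)

# The recursion solves to the polynomial bound sum_{i<k} C(N, i).
for N in range(1, 8):
    for k in range(1, N + 1):
        assert B(N, k) == sum(comb(N, i) for i in range(k))
print("B(N, k) matches sum_{i=0}^{k-1} C(N, i) for all tested N, k")
```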
MetaQuotes, 2023.04.07 12:15
Caltech's Machine Learning Course - CS 156 by Professor Yaser Abu-Mostafa
Caltech's Machine Learning Course - CS 156. Lecture 07 - The VC Dimension
The lecture introduces the concept of VC dimension, which is the maximum number of points that can be shattered by a hypothesis set, and explains its practical applications. The VC dimension represents the degrees of freedom of a model, and its relationship to the number of parameters in a model is discussed. Examples are given to demonstrate how to compute the VC dimension for different hypothesis sets. The relationship between the number of examples needed and the VC dimension is explored, and it is noted that there is a proportional relationship between the two. The implications of increasing the VC dimension on the performance of a learning algorithm are also discussed. Overall, the lecture provides insights into the VC theory and its practical implications for machine learning.
The video also covers the concept of generalization and the generalization bound, a positive statement that shows the tradeoff between hypothesis set size and good generalization in machine learning. The professor explains that the VC dimension is the largest number of points that can be shattered, one less than the smallest break point, and shows how it can be used to approximate the number of examples needed. He notes the importance of choosing the correct error measure and clarifies that the VC estimate is a loose one, useful for comparing models and for approximating the number of examples needed. The lecture ends by highlighting the commonalities between this material and the design of experiments, and how the principles of learning extend to situations beyond strict learning scenarios.
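As a back-of-the-envelope companion to the sample-complexity discussion, the VC error bar used in the course, epsilon = sqrt((8/N)·ln(4·m_H(2N)/delta)) with m_H(N) ≤ N^d_vc + 1, can be evaluated numerically to see how the required N grows with the VC dimension (a sketch of my own; the target error bar of 0.1 and delta = 0.05 are arbitrary choices):

```python
import numpy as np

def vc_epsilon(N, d_vc, delta=0.05):
    """VC generalization error bar:
    epsilon = sqrt(8/N * ln(4 * m_H(2N) / delta)), with m_H(N) <= N^d_vc + 1.
    """
    m_H = (2 * N) ** d_vc + 1.0
    return np.sqrt(8.0 / N * np.log(4.0 * m_H / delta))

# How many examples until the error bar drops below 0.1?
for d_vc in (3, 5, 10):
    N = 1000
    while vc_epsilon(N, d_vc) > 0.1:
        N += 1000
    print(f"d_vc = {d_vc:2d}: need roughly N = {N} examples (loose bound)")
```

The numbers that come out are far larger than the practical rule of thumb mentioned in the course, which is consistent with the professor's remark that the VC estimate is loose and best used for comparing models.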