Dmitriy Gizlyk
Rating: 4.4 (50)
Information
  • 11+ years experience
  • 0 products
  • 0 demo versions
  • 134 jobs
  • 0 signals
  • 0 subscribers
Professional programming of any complexity for MT4, MT5, C#.
Published article Neural Networks in Trading: Improving Transformer Efficiency by Reducing Sharpness (Final Part)

SAMformer offers a solution to the Transformer's key problems in long-term time series forecasting, including training complexity and poor generalization on small samples. Its shallow architecture and sharpness-aware optimization allow it to avoid bad local minima. In this article, we continue implementing these approaches in MQL5 and assess their practical value.
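
As a concrete illustration of the core idea, here is a minimal MQL5 sketch of one sharpness-aware minimization (SAM) step, the optimization principle behind SAMformer, applied to a toy quadratic loss. The loss, its analytic gradient, and the names Loss, Grad, SamStep, rho, and lr are illustrative assumptions, not the framework's actual training code.

// toy loss L(w) = 0.5 * sum(w_i^2) with gradient g_i = w_i
double Loss(const double &w[])
  {
   double s = 0.0;
   for(int i = 0; i < ArraySize(w); i++)
      s += 0.5 * w[i] * w[i];
   return s;
  }
void Grad(const double &w[], double &g[])
  {
   ArrayResize(g, ArraySize(w));
   for(int i = 0; i < ArraySize(w); i++)
      g[i] = w[i];
  }
// one SAM update: step to the sharpest ascent point within radius rho,
// then descend using the gradient computed at that perturbed point
void SamStep(double &w[], const double rho, const double lr)
  {
   double g[];
   double w_adv[];
   double g_adv[];
   Grad(w, g);
   double norm = 0.0;
   for(int i = 0; i < ArraySize(g); i++)
      norm += g[i] * g[i];
   norm = MathSqrt(norm) + 1e-12;
   ArrayResize(w_adv, ArraySize(w));
   for(int i = 0; i < ArraySize(w); i++)
      w_adv[i] = w[i] + rho * g[i] / norm;     // ascent to the "sharp" neighbour
   Grad(w_adv, g_adv);
   for(int i = 0; i < ArraySize(w); i++)
      w[i] -= lr * g_adv[i];                   // descend with the perturbed gradient
  }
void OnStart()
  {
   double w[3] = {1.0, -2.0, 0.5};
   for(int step = 0; step < 5; step++)
      SamStep(w, 0.05, 0.1);
   PrintFormat("loss after 5 SAM steps: %.6f", Loss(w));
  }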

Published article Neural Networks in Trading: Improving Transformer Efficiency by Reducing Sharpness (SAMformer)

Training Transformer models requires large amounts of data and is often hampered by the models' poor ability to generalize on small samples. The SAMformer framework helps solve this problem by avoiding bad local minima, improving model performance even on limited training sets.

Published article Neural Networks in Trading: Optimizing the Transformer for Time Series Forecasting (LSEAttention)

The LSEAttention framework offers improvements to the Transformer architecture designed specifically for long-term multivariate time series forecasting. The approaches proposed by its authors address entropy collapse and training instability, problems frequently encountered with the vanilla Transformer.
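
To hint at what the log-sum-exp part of the name refers to, below is a small MQL5 sketch of a numerically stable softmax over one row of attention scores, computed through the max-shifted log-sum-exp. The full LSEAttention framework goes well beyond this, so treat the function and its sample scores as an illustrative assumption.

// numerically stable softmax of one row of attention scores via the
// log-sum-exp (max-shift) trick referenced by the LSEAttention name
void StableSoftmax(const double &scores[], double &probs[])
  {
   int n = ArraySize(scores);
   ArrayResize(probs, n);
   double m = scores[0];
   for(int i = 1; i < n; i++)
      if(scores[i] > m) m = scores[i];         // shift by the max for stability
   double lse = 0.0;
   for(int i = 0; i < n; i++)
      lse += MathExp(scores[i] - m);
   lse = m + MathLog(lse);                     // log-sum-exp of the row
   for(int i = 0; i < n; i++)
      probs[i] = MathExp(scores[i] - lse);     // softmax without overflow
  }
void OnStart()
  {
   double scores[4] = {900.0, 905.0, 910.0, 890.0};   // would overflow a naive exp()
   double probs[];
   StableSoftmax(scores, probs);
   for(int i = 0; i < ArraySize(probs); i++)
      PrintFormat("p[%d] = %.6f", i, probs[i]);
  }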

Published article Neural Networks in Trading: Hyperbolic Latent Diffusion Model (Final Part)

Encoding the initial data into a hyperbolic latent space through anisotropic diffusion processes, as proposed in the HypDiff framework, helps preserve the topological features of the current market situation and improves the quality of its analysis. In the previous article, we began implementing these approaches in MQL5; here we continue that work and bring it to its logical conclusion.

Published article Neural Networks in Trading: Hyperbolic Latent Diffusion Model (HypDiff)

This article considers methods for encoding the initial data into a hyperbolic latent space through anisotropic diffusion processes. This helps preserve the topological characteristics of the current market situation more accurately and improves the quality of its analysis.
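
To make the notion of a hyperbolic latent space more tangible, here is a small MQL5 sketch of the exponential map at the origin of the Poincaré ball, a standard way to carry a Euclidean feature vector into hyperbolic space. HypDiff's actual encoder, with its anisotropic diffusion, is considerably more involved; the function name and the curvature parameter c are illustrative assumptions.

// exponential map at the origin of the Poincare ball with curvature c:
// maps a Euclidean vector v to a point inside the unit ball
void ExpMapZero(const double &v[], const double c, double &x[])
  {
   int n = ArraySize(v);
   ArrayResize(x, n);
   double norm = 0.0;
   for(int i = 0; i < n; i++)
      norm += v[i] * v[i];
   norm = MathSqrt(norm);
   if(norm < 1e-12)
     {
      for(int i = 0; i < n; i++)
         x[i] = 0.0;                           // the origin maps to itself
      return;
     }
   double scale = MathTanh(MathSqrt(c) * norm) / (MathSqrt(c) * norm);
   for(int i = 0; i < n; i++)
      x[i] = scale * v[i];                     // result lies strictly inside the ball
  }
void OnStart()
  {
   double v[3] = {0.8, -1.5, 2.0};
   double x[];
   ExpMapZero(v, 1.0, x);
   PrintFormat("hyperbolic point: (%.4f, %.4f, %.4f)", x[0], x[1], x[2]);
  }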

Published article Neural Networks in Trading: Directional Diffusion Models (DDM)

In this article, we discuss Directional Diffusion Models that exploit data-dependent anisotropic and directed noise in a forward diffusion process to capture meaningful graph representations.
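
The sketch below illustrates only the anisotropy aspect in MQL5: a forward diffusion step that corrupts each feature with its own noise scale. In DDM the noise is data-dependent and directed along structure learned from the graph, so the fixed noise_scale vector, the alpha_bar parameter, and the Box-Muller generator here are illustrative assumptions.

// toy forward-diffusion step with per-feature (anisotropic) noise scales
void ForwardDiffuse(const double &x0[], const double &noise_scale[],
                    const double alpha_bar, double &xt[])
  {
   int n = ArraySize(x0);
   ArrayResize(xt, n);
   for(int i = 0; i < n; i++)
     {
      // Box-Muller: two uniform samples -> one standard normal sample
      double u1  = MathMax(MathRand() / 32767.0, 1e-6);
      double u2  = MathRand() / 32767.0;
      double eps = MathSqrt(-2.0 * MathLog(u1)) * MathCos(2.0 * M_PI * u2);
      xt[i] = MathSqrt(alpha_bar) * x0[i]
            + MathSqrt(1.0 - alpha_bar) * noise_scale[i] * eps;   // anisotropic corruption
     }
  }
void OnStart()
  {
   double x0[3]    = {1.2, -0.4, 0.9};
   double scale[3] = {0.2,  1.0, 3.0};   // stronger corruption along the last axis
   double xt[];
   ForwardDiffuse(x0, scale, 0.5, xt);
   PrintFormat("noised sample: (%.4f, %.4f, %.4f)", xt[0], xt[1], xt[2]);
  }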

Published article Neural Networks in Trading: Node-Adaptive Graph Representation with NAFS

We invite you to get acquainted with the NAFS (Node-Adaptive Feature Smoothing) method, which is a non-parametric approach to creating node representations that does not require parameter training. NAFS extracts features of each node given its neighbors and then adaptively combines these features to form a final representation.
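
A minimal MQL5 sketch of a single smoothing pass over a tiny four-node graph is shown below: each node's feature becomes the mean over its closed neighbourhood. NAFS stacks several such passes and adaptively combines them per node, which this sketch omits; the graph and all names are illustrative assumptions.

// one feature-smoothing step: average every node's feature with its neighbours
#define NODES 4
void SmoothStep(const double &feat[], const int &adj[][NODES], double &out[])
  {
   ArrayResize(out, NODES);
   for(int i = 0; i < NODES; i++)
     {
      double sum = feat[i];
      int    cnt = 1;
      for(int j = 0; j < NODES; j++)
         if(adj[i][j] == 1)
           {
            sum += feat[j];                    // include neighbour features
            cnt++;
           }
      out[i] = sum / cnt;                      // mean over the closed neighbourhood
     }
  }
void OnStart()
  {
   int adj[NODES][NODES] = { {0,1,1,0}, {1,0,0,1}, {1,0,0,1}, {0,1,1,0} };
   double feat[NODES] = {1.0, 2.0, 3.0, 4.0};
   double smooth[];
   SmoothStep(feat, adj, smooth);
   for(int i = 0; i < NODES; i++)
      PrintFormat("node %d: %.3f -> %.3f", i, feat[i], smooth[i]);
  }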

Published article Neural Networks in Trading: Contrastive Pattern Transformer (Final Part)

In the previous article in this series, we looked at the Atom-Motif Contrastive Transformer (AMCT) framework, which uses contrastive learning to discover key patterns at all levels, from basic elements to complex structures. In this article, we continue implementing the AMCT approaches in MQL5.

Published article Neural Networks in Trading: Contrastive Pattern Transformer

The Contrastive Transformer is designed to analyze markets both at the level of individual candlesticks and based on entire patterns. This helps improve the quality of market trend modeling. Moreover, the use of contrastive learning to align representations of candlesticks and patterns fosters self-regulation and improves the accuracy of forecasts.
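
To show what aligning candlestick and pattern representations can look like in code, here is a compact MQL5 sketch of an InfoNCE-style contrastive loss with cosine similarity: the loss becomes small when the candlestick view sits closer to its matching pattern view than to the negatives. The loss actually used by AMCT may differ in detail, and the embeddings and names here are illustrative assumptions.

// cosine similarity between two embeddings of equal length
double CosSim(const double &a[], const double &b[])
  {
   double dot = 0.0, na = 0.0, nb = 0.0;
   for(int i = 0; i < ArraySize(a); i++)
     {
      dot += a[i] * b[i];
      na  += a[i] * a[i];
      nb  += b[i] * b[i];
     }
   return dot / (MathSqrt(na) * MathSqrt(nb) + 1e-12);
  }
// InfoNCE-style loss for one anchor, one positive and several negatives
double InfoNCE(const double &anchor[], const double &positive[],
               const double &negatives[][3], const double temperature)
  {
   double pos   = MathExp(CosSim(anchor, positive) / temperature);
   double denom = pos;
   for(int k = 0; k < ArrayRange(negatives, 0); k++)
     {
      double neg[3];
      for(int i = 0; i < 3; i++)
         neg[i] = negatives[k][i];
      denom += MathExp(CosSim(anchor, neg) / temperature);
     }
   return -MathLog(pos / denom);               // small when the pair is well aligned
  }
void OnStart()
  {
   double candle[3]  = {0.9, 0.1, 0.2};        // "candlestick view" embedding
   double pattern[3] = {0.8, 0.2, 0.1};        // matching "pattern view" embedding
   double negs[2][3] = { {-0.7, 0.5, 0.3}, {0.1, -0.9, 0.4} };
   PrintFormat("contrastive loss: %.4f", InfoNCE(candle, pattern, negs, 0.1));
  }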

Published article Neural Networks in Trading: Market Analysis Using a Pattern Transformer

When we use models to analyze the market situation, we mainly focus on individual candlesticks. However, it has long been known that candlestick patterns can help in predicting future price movements. In this article, we will get acquainted with a method that integrates both of these approaches.

Published article Neural Networks in Trading: Transformer with Relative Encoding

Self-supervised learning can be an effective way to analyze large amounts of unlabeled data. Its efficiency comes from adapting models to the specific features of financial markets, which helps improve on traditional methods. This article introduces an alternative attention mechanism that takes into account the relative dependencies and relationships between inputs.
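
The simplest way to picture relative encoding is a score bias that depends only on the distance between two positions. The MQL5 sketch below adds such a bias to one row of attention scores before the softmax; the mechanism described in the article is richer than a plain distance lookup, so the rel_bias table and the function name are illustrative assumptions.

// attention row with a distance-dependent bias added before the softmax
void RelAttentionRow(const int i, const double &scores[],   // raw q*k/sqrt(d) scores for row i
                     const double &rel_bias[],              // bias indexed by distance |i - j|
                     double &weights[])
  {
   int n = ArraySize(scores);
   ArrayResize(weights, n);
   double biased[];
   ArrayResize(biased, n);
   double m = -DBL_MAX;
   for(int j = 0; j < n; j++)
     {
      int d = (i >= j) ? i - j : j - i;                     // relative distance
      biased[j] = scores[j] + rel_bias[d];                  // distance-aware score
      if(biased[j] > m) m = biased[j];
     }
   double sum = 0.0;
   for(int j = 0; j < n; j++)
     {
      weights[j] = MathExp(biased[j] - m);
      sum += weights[j];
     }
   for(int j = 0; j < n; j++)
      weights[j] /= sum;                                    // softmax over the row
  }
void OnStart()
  {
   double scores[4]   = {0.2, 0.5, 0.1, 0.4};
   double rel_bias[4] = {0.3, 0.1, 0.0, -0.2};              // nearer bars get a boost
   double w[];
   RelAttentionRow(1, scores, rel_bias, w);
   for(int j = 0; j < 4; j++)
      PrintFormat("attention(1,%d) = %.4f", j, w[j]);
  }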

Published article Neural Networks in Trading: Controlled Segmentation (Final Part)

We continue the work started in the previous article on building the RefMask3D framework using MQL5. This framework is designed to comprehensively study multimodal interaction and feature analysis in a point cloud, followed by target object identification based on a description provided in natural language.

Published article Neural Networks in Trading: Controlled Segmentation

In this article, we discuss a method for comprehensive analysis of multimodal interaction and feature understanding.

Published article Neural Networks in Trading: Generalized 3D Referring Expression Segmentation

While analyzing the market situation, we divide it into separate segments and identify key trends. However, traditional analysis methods often focus on a single aspect, which limits the overall picture. In this article, we will learn about a method that enables the selection of multiple objects to provide a more comprehensive and multi-layered understanding of the situation.

Published article Neural Networks in Trading: Mask-Attention-Free Approach to Price Movement Forecasting

In this article, we will discuss the Mask-Attention-Free Transformer (MAFT) method and its application in the field of trading. Unlike traditional Transformers that require data masking when processing sequences, MAFT optimizes the attention process by eliminating the need for masking, significantly improving computational efficiency.

Published article Neural Networks in Trading: Superpoint Transformer (SPFormer)

In this article, we introduce a method for segmenting 3D objects based on Superpoint Transformer (SPFormer), which eliminates the need for intermediate data aggregation. This speeds up the segmentation process and improves the performance of the model.

Published article Neural Networks in Trading: Exploring the Local Structure of Data

Effective identification and preservation of the local structure of market data in noisy conditions is a critical task in trading. The use of the Self-Attention mechanism has shown promising results in processing such data; however, the classical approach does not account for the local characteristics of the underlying structure. In this article, I introduce an algorithm capable of incorporating these structural dependencies.
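
One elementary way to inject locality into Self-Attention is to let each bar attend only to a window of its nearest neighbours, as in the MQL5 sketch below. The algorithm presented in the article models local structure differently, so this is purely an illustration of the locality idea, with hypothetical names and sample scores.

// attention row restricted to a window of radius r around position i
void LocalAttentionRow(const int i, const int r, const double &scores[], double &weights[])
  {
   int n = ArraySize(scores);
   ArrayResize(weights, n);
   double m = -DBL_MAX;
   for(int j = 0; j < n; j++)
     {
      int d = (i >= j) ? i - j : j - i;
      if(d <= r && scores[j] > m)
         m = scores[j];                        // max over the window for stability
     }
   double sum = 0.0;
   for(int j = 0; j < n; j++)
     {
      int d = (i >= j) ? i - j : j - i;
      weights[j] = (d <= r) ? MathExp(scores[j] - m) : 0.0;   // outside the window: zero
      sum += weights[j];
     }
   for(int j = 0; j < n; j++)
      weights[j] /= sum;                       // softmax restricted to the window
  }
void OnStart()
  {
   double scores[6] = {0.3, 0.9, 0.1, 0.7, 0.2, 0.5};
   double w[];
   LocalAttentionRow(2, 1, scores, w);         // bar 2 sees only bars 1..3
   for(int j = 0; j < 6; j++)
      PrintFormat("attention(2,%d) = %.4f", j, w[j]);
  }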

Published article Neural Networks in Trading: Scene-Aware Object Detection (HyperDet3D)

We invite you to get acquainted with a new approach to object detection using hypernetworks. A hypernetwork generates weights for the main model, which makes it possible to account for the specifics of the current market situation. This approach improves forecasting accuracy by adapting the model to different trading conditions.
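
A hypernetwork in its smallest form is sketched below in MQL5: a short context vector, standing in crudely for the current market state, is linearly mapped into the weights of the main layer, so the same input is processed differently in different regimes. HyperDet3D's hyper- and main networks are of course far larger; the context vectors, generator weights, and all names here are hypothetical.

#define CTX 2   // context size
#define IN  3   // main-layer inputs
void HyperForward(const double &context[], const double &hyper_w[],  // generator weights, IN*CTX values
                  const double &x[], double &y)
  {
   // 1) generate the main layer's weights from the context
   double w[IN];
   for(int i = 0; i < IN; i++)
     {
      w[i] = 0.0;
      for(int c = 0; c < CTX; c++)
         w[i] += hyper_w[i * CTX + c] * context[c];
     }
   // 2) run the main layer with the generated weights
   y = 0.0;
   for(int i = 0; i < IN; i++)
      y += w[i] * x[i];
  }
void OnStart()
  {
   double trend_ctx[CTX] = {1.0, 0.0};        // hypothetical "trending" context
   double flat_ctx[CTX]  = {0.0, 1.0};        // hypothetical "flat" context
   double hyper_w[6]     = {0.9, 0.1, 0.5, 0.2, -0.3, 0.8};   // IN*CTX generator weights
   double x[IN] = {0.4, 1.1, -0.6};
   double y1, y2;
   HyperForward(trend_ctx, hyper_w, x, y1);
   HyperForward(flat_ctx,  hyper_w, x, y2);
   PrintFormat("same input, different regimes: %.4f vs %.4f", y1, y2);
  }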

Published article Neural Networks in Trading: Transformer for the Point Cloud (Pointformer)

In this article, we will talk about algorithms that apply attention methods to object detection in point clouds. Object detection in point clouds is important for many real-world applications.

Published article Neural Networks in Trading: Hierarchical Feature Learning for Point Clouds

We continue to study algorithms for extracting features from a point cloud. In this article, we will get acquainted with the mechanisms for increasing the efficiency of the PointNet method.