[Archive!] Pure mathematics, physics, chemistry, etc.: brain-training problems not related to trade in any way - page 562
I really did help you. I removed all my posts about the SEG generator; I didn't even upload the video of the actual setup. So why do you need this "orthogonal/vector mess", eight pages long?
Yeah, well, your help is invaluable. I owe you one. All right, I'll have to spill.
I need it to increase the efficiency of my self-written optimizer: to inject, en masse, genes orthogonal to the degenerate set into the population. When the genetic algorithm starts to stall, it means the genes in it are becoming prone to linear dependence (because interbreeding is going on almost exclusively inside a set of "relatives"). Such an insertion (with subsequent crossing) may refresh the population with "new blood" and widen the search space, preventing the population from getting stuck in local minima.
// There are a few more subtleties, but those are classified. Better not to insist: if I tell you about them, I'll have to kill you afterwards.
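The injection step above can be sketched in a few lines. This is only an illustration of the idea, assuming genes are real-valued vectors stacked in a matrix; the function name, tolerance, and use of SVD to find the orthogonal complement of the population's span are my own choices, not the author's code:

```python
import numpy as np

def inject_orthogonal_genes(population, k, rng=None):
    """Return k random genes orthogonal to the span of the current population."""
    rng = np.random.default_rng(rng)
    pop = np.asarray(population, dtype=float)      # shape (m, n): m genes of length n
    # Right-singular vectors with (near-)zero singular values span the
    # orthogonal complement of the population's row space.
    _, s_vals, vt = np.linalg.svd(pop)
    rank = int(np.sum(s_vals > 1e-10))
    null_basis = vt[rank:]                         # shape (n - rank, n)
    if null_basis.shape[0] == 0:
        # Population already spans the whole space: nothing orthogonal to add.
        return np.empty((0, pop.shape[1]))
    # Random linear combinations of the complement basis = "new blood" genes.
    coeffs = rng.standard_normal((k, null_basis.shape[0]))
    return coeffs @ null_basis
```

Every returned gene has zero scalar product with every existing gene, so crossing them into the population genuinely widens the searched subspace.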
1. To Mislaid, Mathemat:
Both here and there it's the same thing: the very process I constructed myself just yesterday. Consecutive subtraction of a vector's projections onto the previous orthogonal vectors.
It's days like this that make me feel like a classic.... :-))
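That "consecutive subtraction of projections onto the previous orthos" is the classical Gram-Schmidt process. A minimal NumPy sketch (the function name and the dependence tolerance are my own):

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize vectors by subtracting projections on previous orthos."""
    ortho = []
    for v in vectors:
        v = np.asarray(v, dtype=float)
        for u in ortho:
            v = v - np.dot(v, u) * u      # subtract projection onto u
        norm = np.linalg.norm(v)
        if norm > 1e-12:                  # skip (near-)linearly-dependent vectors
            ortho.append(v / norm)
    return ortho
```

The result is an orthonormal basis of the subspace spanned by the inputs; dependent inputs are simply dropped.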
--
By the way, I already wrote and debugged the test script last night. Along the way I found a bug in the pyramid optimizer and reported it to the Service Desk. I worked around the bug by slightly changing the code. So everything works: reliable and fast, just the way I needed it.
2. There really is one in OpenCL, but only for the three-dimensional case [cross(a, b) builds a vector orthogonal to the two given ones]. I need it for arbitrary dimension.
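For arbitrary dimension, the standard generalization takes n-1 vectors in R^n and builds the orthogonal vector by cofactor expansion of a formal determinant. A sketch assuming NumPy (this is an illustration of the textbook construction, not anyone's library function):

```python
import numpy as np

def generalized_cross(vectors):
    """Vector orthogonal to n-1 given vectors in R^n (cofactor expansion)."""
    m = np.asarray(vectors, dtype=float)       # shape (n-1, n)
    n = m.shape[1]
    assert m.shape == (n - 1, n), "need exactly n-1 vectors of length n"
    result = np.empty(n)
    for i in range(n):
        minor = np.delete(m, i, axis=1)        # drop column i
        result[i] = (-1) ** i * np.linalg.det(minor)
    return result
```

For n = 3 this reproduces the ordinary cross product; in any dimension the result has zero scalar product with every input vector (and is zero exactly when the inputs are linearly dependent).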
Let's go on. The scalar product of two vectors a[] and b[] is the sum of products a[i]*b[i]*w[i], where w[i] is a weight function. Depending on which weights we choose, we get solutions to different problems, all obtained by the same universal algorithm of sequential orthogonalization. (By the way, the example above constructs an orthogonal projection onto a subspace spanned by arbitrary vectors.) In the case w[i] = 1 it is the ordinary scalar product of two vectors in Cartesian space.
If we set w[i] = r[i]*s[i], where
s[i] = 0.5/n, for i = 0 or i = n;
s[i] = 1/n, for 0 < i < n;
then the scalar product is the integral of the product of the functions a(x)*b(x)*r(x) over the interval [0;1], expressed in finite differences.
If this is legitimate, then we can easily build any regression, naturally in finite differences, without any strain.
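Those weights s[i] are exactly the trapezoidal rule on a uniform grid of n+1 points over [0;1], so the weighted scalar product really does approximate the integral. A quick check with r(x) = 1 (the test functions are my own choice):

```python
import numpy as np

n = 100
x = np.linspace(0.0, 1.0, n + 1)

s = np.full(n + 1, 1.0 / n)     # s[i] = 1/n for interior points
s[0] = s[-1] = 0.5 / n          # s[0] = s[n] = 0.5/n at the endpoints
r = np.ones(n + 1)              # weight function r(x) = 1 for this demo

a = np.sin(np.pi * x)
b = x

# Weighted scalar product: sum of a[i]*b[i]*w[i] with w[i] = r[i]*s[i]
approx = np.sum(a * b * r * s)
# The exact integral of x*sin(pi*x) over [0;1] is 1/pi
```

With n = 100 the weighted sum matches 1/pi to about four decimal places, which is the expected O(1/n^2) accuracy of the trapezoidal rule.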
Only it seemed to me that this was a dead-end road. And I passed it.
Well, it means only one thing: the relative error of the approximation gets bigger the smaller X (and Y) is. But what do you expect when dividing one small number by another small number? Try the change of variable X' = X + 100 and plot the new series over the range 100 to 400 instead of 0 to 300: the graph will be much straighter, but it won't change the substance.
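A quick numeric illustration of that point, with made-up data (true slope 2, additive noise of fixed size), showing that the same absolute noise produces a much larger relative error near X = 0:

```python
import numpy as np

rng = np.random.default_rng(42)
x = np.arange(1.0, 301.0)                  # X from 1 to 300
noise = rng.normal(0.0, 1.0, x.size)       # additive noise of fixed size
y = 2.0 * x + noise                        # true relation: Y = 2*X

# Relative error of the pointwise slope estimate y/x
rel_err = np.abs(y / x - 2.0) / 2.0

small_x_err = rel_err[:10].mean()          # near X = 0
large_x_err = rel_err[-10:].mean()         # near X = 300
```

Since rel_err = |noise| / (2*X), the average relative error at small X is roughly two orders of magnitude larger than at large X, even though the noise itself is identical everywhere.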
1. Sergey, it's too early for me to go further. When I get a better handle on Cartesian space, I'll get into functional space. But it's an interesting topic, thank you for the post. You'll laugh, but it turned out to be quite informative for me.
2. You must have some doubts about the dead end yourself, since you're proposing to go further... :) If anything, I'll know whom to choose as a guide on this "dead-end" path. Seriously: if I have any questions on the subject, I'll ask. Do you mind?
You should have asked me first, before looking for orthogonal multidimensional vectors... :)
It would have saved you time, because it won't help you. I mean, you don't need them at all (the orthogonal vectors, that is).
I don't buy it. I bet you just keep them under your pillow and don't show them to anyone.
Or you're trying to extort the secret subtleties after all. (A variant of elaborate suicide.)
;)
What a load of nonsense.
Medieval Arabic numerals are very close to the form in which Europeans borrowed them, together with positional notation, from more developed nations.