How do you work with neural networks?

 
VladislavVG:

What do you mean by "optimization"? If you mean simply enumerating all the options, that is not optimization at all. It is MT that is confusing you.

Now about GA: it is a search method; in the case of training a network, we are looking for the minimum of some functional. Mistakes are often made here. In the process of network training, backpropagation (ORO), GA, gradient methods, and simulated annealing (a method similar in spirit to GA) all try to find an extremum. Which method will be more effective depends on the functional and on the quality criterion (i.e., the criterion by which the best variants are selected). GA is the most universal of these methods, but none of them guarantees finding the global extremum.
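To make this concrete, here is a minimal sketch of a GA searching for the minimum of a simple functional. The functional f, the population size, and the mutation scale are all made-up illustrative choices, not anything from this thread:

```python
# A minimal genetic algorithm minimizing a simple functional f(x).
# f, the population size, and the mutation scale are illustrative
# assumptions; the global minimum of f is at x = 3.
import random

def f(x):
    return (x - 3.0) ** 2 + 1.0

def ga_minimize(pop_size=20, generations=100, mutation_scale=0.5):
    # Start from a random population of candidate solutions.
    population = [random.uniform(-10.0, 10.0) for _ in range(pop_size)]
    for _ in range(generations):
        # Selection: lower f means fitter; keep the better half.
        population.sort(key=f)
        survivors = population[: pop_size // 2]
        # Crossover + mutation: a child is the average of two parents
        # plus Gaussian noise.
        while len(survivors) < pop_size:
            a, b = random.sample(survivors[: pop_size // 2], 2)
            survivors.append((a + b) / 2.0 + random.gauss(0.0, mutation_scale))
        population = survivors
    return min(population, key=f)

print(ga_minimize())  # prints a value close to 3.0
```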

With a GA it is possible, for example, to select the network architecture simultaneously, i.e. to include the architecture among the optimized parameters and define a quality criterion (a fitness function, in GA terms). There are further possibilities. And, if necessary, you can use backpropagation together with a GA.
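A hedged sketch of that idea of including the architecture among the optimized parameters: here the genome carries both the number of hidden neurons and the weights of a one-hidden-layer network, and the fitness function is the error on a toy dataset. The data, ranges, and GA settings are illustrative assumptions:

```python
# GA whose genome encodes both the architecture (number of hidden
# neurons) and the weights of a one-hidden-layer network.
import math
import random

# Toy regression data: a few samples of y = sin(x).
DATA = [(x / 10.0, math.sin(x / 10.0)) for x in range(-30, 31, 3)]

def forward(genome, x):
    # genome = (n_hidden, weights); weights are laid out as
    # [input weights | hidden biases | output weights | output bias].
    n, w = genome
    h = [math.tanh(w[i] * x + w[n + i]) for i in range(n)]
    return sum(w[2 * n + i] * h[i] for i in range(n)) + w[3 * n]

def fitness(genome):
    # Quality criterion (the GA's fitness function): MSE, lower is better.
    return sum((forward(genome, x) - y) ** 2 for x, y in DATA) / len(DATA)

def random_genome():
    n = random.randint(1, 8)                          # architecture gene
    return (n, [random.gauss(0, 1) for _ in range(3 * n + 1)])

def mutate(genome):
    n, w = genome
    w = [wi + random.gauss(0, 0.2) for wi in w]       # perturb weights
    if random.random() < 0.1:                         # occasionally resize
        n = max(1, min(8, n + random.choice((-1, 1))))
        w = (w + [random.gauss(0, 1) for _ in range(3 * n + 1)])[: 3 * n + 1]
    return (n, w)

population = [random_genome() for _ in range(40)]
for _ in range(200):
    population.sort(key=fitness)                      # selection
    parents = population[:20]
    population = parents + [mutate(random.choice(parents)) for _ in range(20)]
best = min(population, key=fitness)
print("hidden neurons:", best[0], "MSE:", fitness(best))
```

The only point of the sketch is that the architecture gene competes in the same selection as the weight genes.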

Good luck.


You have already answered your own question: GA is not a neural network. By the same logic a gradient method could just as well be called a NS. There is a car and there is a driver. There are plenty of ways to teach the driver to drive the car, but none of those ways is the car.

That is all Swetten is claiming. So what exactly are you arguing about?

 
Farnsworth:

You have already answered your own question: GA is not a neural network. By the same logic a gradient method could just as well be called a NS. There is a car and there is a driver. There are plenty of ways to teach the driver to drive the car, but none of those ways is the car.

That is all Swetten is claiming. So what exactly are you arguing about?

But I did not claim that GA is a NS. I showed how NS and GA can be connected, in response to Svetlana's phrase:

Let me recall the original statement: NS and GA are completely different things, not connected to each other in any way.

That is, the claim that there is no such connection.


Good luck.
 
VladislavVG:

But I did not claim that GA is a NS. I showed how NS and GA can be connected, in response to Svetlana's phrase that there is no such connection.

1. You didn't read the thread carefully and didn't understand what the phrase was referring to;

2. The funniest thing is that they really are not connected to each other. Add GA to a NS and you get one thing; add backpropagation and you get another; add something else and you get yet another.

3. GA is just an optimization mechanism, and a universal one at that: it can be applied to almost anything, including the optimization of a NS.

From points 2 and 3 I conclude that NS and GA have nothing to do with each other.

 

A NS is a transformation method (whatever the kind: approximation, classification, filtering, logical transformation).

A GA is an optimization method.

One is not the other, and neither can replace the other. And basta.

Many articles and books on NS, when talking about artificial neural networks, implicitly assume that they are trained by backpropagation, which misleads readers. Worse, I have come across statements that "such networks don't work because...", as if a network trained with some other optimization algorithm were already a completely different network, with completely different qualities attributed to these "other" networks. That is utter nonsense. A network is a network: it will not change its properties whichever optimization method we train it by. Only the quality of the training may change, that is all.
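To illustrate this separation with a hedged sketch: below, the very same fixed network is trained once by numerical gradient descent and once by a crude (1+1) evolutionary search. The transformation itself never changes; only the way its weights are found differs. The toy data and all hyperparameters are illustrative assumptions:

```python
# The network is a fixed transformation; the trainer is a separate,
# swappable component. Two trainers act on the same 1-2-1 network:
# numerical gradient descent and a (1+1) evolutionary search.
# Data and hyperparameters are illustrative assumptions.
import math
import random

DATA = [(0.0, 0.0), (0.5, 0.48), (1.0, 0.84)]   # toy samples of y = sin(x)

def network(w, x):
    # The transformation itself: it never changes with the trainer.
    h0 = math.tanh(w[0] * x + w[1])
    h1 = math.tanh(w[2] * x + w[3])
    return w[4] * h0 + w[5] * h1 + w[6]

def loss(w):
    return sum((network(w, x) - y) ** 2 for x, y in DATA)

def train_gradient(w, lr=0.1, steps=2000, eps=1e-5):
    # Trainer #1: gradient descent with numerical derivatives.
    w = list(w)
    for _ in range(steps):
        base = loss(w)
        grad = []
        for i in range(len(w)):
            wp = list(w)
            wp[i] += eps
            grad.append((loss(wp) - base) / eps)
        w = [wi - lr * gi for wi, gi in zip(w, grad)]
    return w

def train_evolution(w, steps=5000, scale=0.1):
    # Trainer #2: (1+1) evolution strategy, a degenerate GA.
    w = list(w)
    for _ in range(steps):
        candidate = [wi + random.gauss(0, scale) for wi in w]
        if loss(candidate) < loss(w):
            w = candidate
    return w

w0 = [random.gauss(0, 0.5) for _ in range(7)]
for trainer in (train_gradient, train_evolution):
    trained = trainer(w0)
    print(trainer.__name__, "final loss:", round(loss(trained), 5))
```

Swap the trainer and the network's properties as a transformation stay exactly the same; only the achieved training quality can differ.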