What are spurious states in Hopfield networks?
Spurious attractors in Hopfield networks
I think your intuition that the higher energy of spurious states explains their greater susceptibility to unlearning may be correct.
In a Hopfield network, spurious states are activity patterns that are not explicitly embedded in the synaptic matrix but are nonetheless stable. In other words, they are "undesired" attractor states that arise as local minima of the energy function due to their finite overlap with the "desired" attractor states. The unlearning rule of Hopfield et al. (1983) modifies the synaptic matrix so as to raise the energy of whatever stable states the network dynamics settle into, thereby making those minima shallower, be they spurious or embedded states. Since the spurious states have a higher energy than the embedded ones, they are more strongly affected by unlearning.
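The rule described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' implementation: the network size, the number of patterns, the unlearning rate `eps`, and the number of unlearning steps are all illustrative choices of mine.

```python
import numpy as np

rng = np.random.default_rng(1)
N, p, eps = 200, 5, 0.01                  # units, patterns, unlearning rate (illustrative)
xi = rng.choice([-1, 1], size=(p, N))     # stored binary patterns
J = (xi.T @ xi) / N                       # Hebbian synaptic matrix
np.fill_diagonal(J, 0)                    # no self-coupling

def settle(s, J, sweeps=10):
    """Run asynchronous zero-temperature dynamics for a few sweeps."""
    s = s.copy()
    for _ in range(sweeps):
        for i in rng.permutation(N):
            s[i] = 1 if J[i] @ s >= 0 else -1
    return s

def unlearn_step(J):
    """Relax from a random state, then weaken the attractor that was found."""
    s = settle(rng.choice([-1, 1], size=N), J)
    J = J - (eps / N) * np.outer(s, s)    # raises the energy of that state
    np.fill_diagonal(J, 0)
    return J

for _ in range(20):
    J = unlearn_step(J)
```

Because unlearning does not know which attractors are embedded and which are spurious, it weakens whatever the dynamics find; the claim in the text is that the shallower spurious minima are eroded faster than the deeper embedded ones.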
Why do spurious states have a higher energy than the embedded attractor states? In general this need not be the case, but it holds in the regime where the Hopfield network does not exceed its storage capacity, that is, when the ratio of the number of stored patterns to the number of units, p/N, is below the critical capacity αc ≈ 0.138. In this regime one can estimate the overlaps of the spurious states with the stored patterns and show that they are generically smaller than 1 (the overlap of a stored pattern with itself). Because of the Hebbian construction of the synaptic matrix in the Hopfield model, these overlaps appear directly in the energy function: the energy of a pattern is proportional to minus the sum of its squared overlaps with the stored patterns. This implies that the spurious patterns have a higher energy than the stored ones.
In general, this kind of naive reasoning needs to be backed by more rigorous arguments from probability theory. These show, for example, that even in the regime below αc the retrieved patterns are in fact spurious states as soon as the number of embedded patterns p exceeds N/(2 ln N). Such spurious states, however, have such a large overlap with the stored patterns (about 0.97) that they essentially coincide with them.
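As a quick arithmetic check on the N/(2 ln N) threshold, here is what it evaluates to for a few network sizes (the sizes are arbitrary examples):

```python
import math

# Below p_max ~ N / (2 ln N), every stored pattern is expected to be an
# exact fixed point; above it, retrieved states deviate slightly.
for N in (100, 1000, 10000):
    print(f"N={N}: p_max ~ {N / (2 * math.log(N)):.1f}")
```

Note how slowly this grows compared to the capacity bound 0.138·N, which is why the two regimes are worth distinguishing.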
This result, and its generalizations to non-zero temperature (i.e. noise in the dynamics) and beyond the critical capacity, were worked out in the following rather technical paper:
and in the book: