Long-distance dependencies as direct relationships

2025-01-28

Words are a superposition of hidden Markov chains that share the same surface. The characters are the surface. But the underlying structure, including long-distance relationships, consists of direct relationships along different dimensionalities.
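One hedged way to make this concrete is a factorial-HMM-style toy (the chains, transition tables, and stride below are illustrative assumptions, not from the note): two independent latent chains are superposed onto one character stream, a "fast" chain that steps every character and a "slow" chain that steps only every few characters. A dependency that spans several surface positions is then a single, direct transition in the slow chain's own dimension.

```python
# Toy sketch: two latent Markov chains sharing one surface.
# All tables and STRIDE are hypothetical, chosen for illustration.

STRIDE = 5
FAST = {"a": "b", "b": "c", "c": "a"}  # deterministic fast chain, steps every char
SLOW = {"X": "Y", "Y": "Z", "Z": "X"}  # deterministic slow chain, steps every STRIDE chars

def generate(n):
    """Emit n surface symbols; each is the pair of current latent states."""
    fast, slow = "a", "X"
    surface = []
    for t in range(n):
        surface.append(slow + fast)    # surface symbol exposes both dimensions
        fast = FAST[fast]
        if (t + 1) % STRIDE == 0:
            slow = SLOW[slow]          # one direct step in the slow dimension
    return surface

chars = generate(15)
print(chars)

# Positions 0 and STRIDE are far apart on the surface, but their
# slow components are exactly one direct transition apart:
assert SLOW[chars[0][0]] == chars[STRIDE][0]
```

The long-distance pattern on the surface (matching slow components every STRIDE characters) is adjacency in the slow chain's dimension; distance is an artifact of projecting both chains onto one character sequence.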