By
Prof. Isaac Galistein

Technology is good, but we do not understand it. We need a philosophy of tech, one that takes into account its psycho-socio-political implications. Tech made us human: tool manipulation demanded an opposable thumb, and walking erect freed our forelimbs for that manipulation. But I think its effect went deeper than this.

Evolution of mind

Tech required us to concentrate on the task at hand, freeing us from instinctual drives. That concentration demanded a repository for thoughts of no immediate relevance. Could tech, then, have caused the split between the conscious and unconscious mind? If so, a specifically short-sighted consciousness would have been the result, raising the possibility that the more information we have, the smaller and more individualistic our conscious mind becomes.
History bears out this possibility: moves against spirituality and community have tracked the growth of technical information. If the process continues, the time could come when tech shrinks our minds so much that we are no longer human.
We can see this in moral behaviour. We seem far more benevolent to each other nowadays, but could this be only because tech provides services and order that cancel out the struggle to survive? It may be that we have simply sidelined morality rather than become moral.

Info-tech

Modern info-tech holds its own problems. Information-gathering and surveillance have now reached the point where the private sphere is disappearing. As the trend continues we are monitored ever more closely, giving authority unprecedented control over us. Tech is taking away our freedoms.
It is also taking away our ability to think. Computer tech is rigid: tasks now follow strict patterns, leaving no call on our initiative. This rigidity is transferring to society, where rules are becoming ever more fundamental. We are becoming cogs in a social machine. Forget The Terminator; the machine world is already here, and it is subtle.
All these problems can be overcome if we realize that they exist. Do so, and tech can be what it should be: our servant, not our master.