What are the max angles of human eyeball rotation?

How much can our eyeballs rotate towards the nose, away from it, towards the top and bottom?

There will be slight differences between individuals due to physiology. Nevertheless, the average maximum angle of elevation is 25 degrees and the maximum angle of depression is 30 degrees. Within the horizontal plane, which we take to contain the central line of vision, the maximum angle of rotation is 35 degrees in the left and right directions respectively.

I attempted to construct an image depicting these angles: simply a Cartesian diagram. You can find further clarification of the details at this link from Nelson & Associates.

Finally, it is important to note that although the eye may be capable of these rotations, it may not be able to focus acutely on all objects in this field, that is, to focus their light on the fovea.

Happy muscular experiments.


Alignment of the body to the gravitational vertical is considered to be the key to human bipedalism. However, changes to the semicircular canals during human evolution 1, 2, 3 suggest that the sense of head rotation that they provide is important for modern human bipedal locomotion. When walking, the canals signal a mix of head rotations associated with path turns, balance perturbations, and other body movements. It is uncertain how the brain uses this information. Here, we show dual roles for the semicircular canals in balance control and navigation control. We electrically evoke a head-fixed virtual rotation signal from semicircular canal nerves 4, 5, 6 as subjects walk in the dark with their head held in different orientations. Depending on head orientation, we can either steer walking by “remote control” or produce balance disturbances. This shows that the brain resolves the canal signal according to head posture into Earth-referenced orthogonal components and uses rotations in vertical planes to control balance and rotations in the horizontal plane to navigate. Because the semicircular canals are concerned with movement rather than detecting vertical alignment, this result shows the importance of movement control and agility rather than precise vertical alignment of the body for human bipedalism.


Our eyes are able to look around a scene and dynamically adjust based on subject matter, whereas cameras capture a single still image. This trait accounts for many of our commonly understood advantages over cameras. For example, our eyes can compensate as we focus on regions of varying brightness, can look around to encompass a broader angle of view, or can alternately focus on objects at a variety of distances.

However, the end result is akin to a video camera — not a stills camera — that compiles relevant snapshots to form a mental image. A quick glance by our eyes might be a fairer comparison, but ultimately the uniqueness of our visual system is unavoidable because:

What we really see is our mind's reconstruction of objects based on input provided by the eyes — not the actual light received by our eyes.

Skeptical? Most are — at least initially. The examples below show situations where one's mind can be tricked into seeing something different from what one's eyes receive:

False Color: Move your mouse onto the corner of the image and stare at the central cross. The missing dot will rotate around the circle, but after a while this dot will appear to be green — even though no green is actually present in the image.

Mach Bands: Move your mouse on and off of the image. Each of the bands will appear slightly darker or lighter near its upper and lower edges — even though each is uniformly gray.

However, this shouldn't discourage us from comparing our eyes and cameras! Under many conditions a fair comparison is still possible, but only if we take into consideration both what we're seeing and how our mind processes this information. Subsequent sections will try to distinguish the two whenever possible.

The eyelids

It is vitally important that the front surface of the eyeball, the cornea, remain moist. This is achieved by the eyelids, which during waking hours sweep the secretions of the lacrimal apparatus and other glands over the surface at regular intervals and which during sleep cover the eyes and prevent evaporation. The lids have the additional function of preventing injuries from foreign bodies, through the operation of the blink reflex. The lids are essentially folds of tissue covering the front of the orbit and, when the eye is open, leaving an almond-shaped aperture. The points of the almond are called canthi; that nearest the nose is the inner canthus, and the other is the outer canthus. The lid may be divided into four layers: (1) the skin, containing glands that open onto the surface of the lid margin, and the eyelashes; (2) a muscular layer containing principally the orbicularis oculi muscle, responsible for lid closure; (3) a fibrous layer that gives the lid its mechanical stability, its principal portions being the tarsal plates, which border directly upon the opening between the lids, called the palpebral aperture; and (4) the innermost layer of the lid, a portion of the conjunctiva. The conjunctiva is a mucous membrane that serves to attach the eyeball to the orbit and lids but permits a considerable degree of rotation of the eyeball in the orbit.

Suggested BVH Joint Rotation Limits

This information is useful for preventing animators from hyperextending joints. It's also required by some programs to set up inverse kinematics.

Forward kinematics is animating by rotating the joints manually. Inverse kinematics is animating by moving the end of a chain of bones (often the hands and feet) causing the bones behind it to be pulled along like links in a chain. Most animators use a combination of the two. Inverse kinematics is good for quickly getting the bones into a position close to what you desire. You can then use forward kinematics to fine tune the positions before setting your keyframe. Inverse kinematics can also be useful for quickly placing a hand, elbow, knee, or foot exactly where you want it.

None of the limits above are 'official'. They aren't due to limitations in the simulator or viewer. They aren't based on anything in the built-in animations. They aren't drawn from medical sources. In fact, I was surprised to be unable to find anything like this on the web, despite it being important to anyone making animations for humanoids. I can only say that the animations I've created with these limits look right to my eyes, and I can make most of these bends with my own body (and I'm not double jointed or a gymnast). Information from more reputable sources would be welcome.

When trying to estimate what the joint rotation limits should be it's important to consider actions that bend the joints to extremes.

Actions I considered include:

  • touch ear to shoulder
  • cross elbows in front of the chest
  • cross elbows behind the head
  • touch elbows behind the back
  • torso tilt to the side (exercise)
  • torso twist (exercise)
  • crossing legs at the knees
  • crossing legs by resting an ankle on the opposite knee
  • doing the splits
  • cartwheel
  • swan dive
  • baseball pitch
  • swinging from a gymnastics high bar (e.g. with your arms behind you)
  • walking on the tips of the toes, as in ballet
  • toe touch
  • hand walking
  • several yoga poses:
    • lotus
    • both feet behind the head
    • bound angle
    • bow
    • crocodile
    • eagle
    • extended side angle
    • gate
    • half spinal twist
    • hero
    • monkey
    • scorpion
    • standing forward bend with hands clasped behind back
    • standing side bend
    • upward bow

    Several of the limits need some explanation.

    The Shldr forward Y rotation limit should normally be 105, but this would make the scorpion yoga pose, and similar positions, impossible. The Y axis is normally used to rotate the arms forward and backward along an approximately horizontal plane, but in positions similar to the scorpion yoga pose it describes rotation along approximately vertical planes, which is what the X axis would normally describe. To place the elbow beside the head, either the X or the Y axis limit must be 180. If the X axis limit is raised to 180, the Y axis limit must also be raised to 180 to allow the elbow to rest at the side (e.g. to smoothly transition into its position by the head). If the Y axis limit is raised, then only that axis needs excessive freedom. The Shldr backward Y rotation limit does not need to be raised, even in positions like the scorpion yoga pose.

    Hands should not normally rotate along the X axis, but some X axis rotation is necessary to compensate for the lack of finger control.

    Since the hip is the root of the bone hierarchy, it rotates the entire body. It therefore has no rotation limits or stiffness.

    Joint stiffness controls how easily a joint rotates around a particular axis. When configured properly it causes chains of bones to move realistically when using inverse kinematics.

    Most of the time, the greater the range of an axis, the less stiff it should be. For that reason I used the following formula to calculate the stiffness for most axes:

    stiffness = 1 - (abs(end - start) / 360)

    "abs" stands for absolute value.
    "start" and "end" are variables containing the minimum and maximum values for rotation along that axis.

    You calculate the fraction of a full 360-degree circle that the rotation limits allow. Since this fraction will increase as the rotation range increases, and we want a value that decreases as the rotation range increases, we subtract it from 1.
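As a quick sketch, the calculation described above can be expressed in Python (an illustrative translation of the rule, not part of any animation tool):

```python
def stiffness(start, end):
    """Stiffness for a joint axis, given its rotation limits in degrees.

    Computes the fraction of a full 360-degree circle the limits span,
    then subtracts it from 1 so that wider ranges give lower stiffness.
    """
    return 1 - abs(end - start) / 360

# An axis free to rotate a full circle has zero stiffness:
print(stiffness(-180, 180))           # 0.0
# A narrower axis, e.g. -30 to 45 degrees, is much stiffer:
print(round(stiffness(-30, 45), 3))   # 0.792
```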

    There are times when you don't want to use this method; animation tools would not make it configurable otherwise.

    I used 105, instead of 180, as the Shldr forward Y axis rotation limit when calculating the stiffness. This best represents the usual behavior of the Y axis and should result in the most natural movement.

    Since Hand X axis rotation should be minimized, I listed their stiffness as 99%.

    Limits of Resolution: The Rayleigh Criterion

    Light diffracts as it moves through space, bending around obstacles, interfering constructively and destructively. While this can be used as a spectroscopic tool—a diffraction grating disperses light according to wavelength, for example, and is used to produce spectra—diffraction also limits the detail we can obtain in images. (Figure)(a) shows the effect of passing light through a small circular aperture. Instead of a bright spot with sharp edges, a spot with a fuzzy edge surrounded by circles of light is obtained. This pattern is caused by diffraction similar to that produced by a single slit. Light from different parts of the circular aperture interferes constructively and destructively. The effect is most noticeable when the aperture is small, but the effect is there for large apertures, too.

    How does diffraction affect the detail that can be observed when light passes through an aperture? (Figure)(b) shows the diffraction pattern produced by two point light sources that are close to one another. The pattern is similar to that for a single point source, and it is just barely possible to tell that there are two light sources rather than one. If they were closer together, as in (Figure)(c), we could not distinguish them, thus limiting the detail or resolution we can obtain. This limit is an inescapable consequence of the wave nature of light.

    There are many situations in which diffraction limits the resolution. The acuity of our vision is limited because light passes through the pupil, the circular aperture of our eye. Be aware that the diffraction-like spreading of light is due to the limited diameter of a light beam, not the interaction with an aperture. Thus light passing through a lens with a diameter D shows this effect and spreads, blurring the image, just as light passing through an aperture of diameter D does. So diffraction limits the resolution of any system having a lens or mirror. Telescopes are also limited by diffraction, because of the finite diameter D of their primary mirror.

    Draw two lines on a white sheet of paper (several mm apart). How far away can you be and still distinguish the two lines? What does this tell you about the size of the eye’s pupil? Can you be quantitative? (The size of an adult’s pupil is discussed in Physics of the Eye.)

    Just what is the limit? To answer that question, consider the diffraction pattern for a circular aperture, which has a central maximum that is wider and brighter than the maxima surrounding it (similar to a slit) [see (Figure)(a)]. It can be shown that, for a circular aperture of diameter D, the first minimum in the diffraction pattern occurs at θ = 1.22 λ/D (providing the aperture is large compared with the wavelength of light, which is the case for most optical instruments). The accepted criterion for determining the diffraction limit to resolution based on this angle was developed by Lord Rayleigh in the 19th century. The Rayleigh criterion for the diffraction limit to resolution states that two images are just resolvable when the center of the diffraction pattern of one is directly over the first minimum of the diffraction pattern of the other. See (Figure)(b). The first minimum is at an angle of θ = 1.22 λ/D, so that two point objects are just resolvable if they are separated by the angle

    θ = 1.22 λ/D, where λ is the wavelength of light (or other electromagnetic radiation) and D is the diameter of the aperture, lens, mirror, etc., with which the two objects are observed. In this expression, θ has units of radians.
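The criterion is simple enough to encode directly; here is a minimal Python helper (the function name and example values are my own, for illustration):

```python
def rayleigh_angle(wavelength_m, diameter_m):
    """Minimum resolvable angle, in radians, for a circular aperture:
    theta = 1.22 * lambda / D (valid when D is much larger than lambda)."""
    return 1.22 * wavelength_m / diameter_m

# Visible light (550 nm) through a 3.00 mm pupil:
theta = rayleigh_angle(550e-9, 3.00e-3)
print(f"{theta:.2e} rad")   # 2.24e-04 rad
```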

    All attempts to observe the size and shape of objects are limited by the wavelength of the probe. Even the small wavelength of light prohibits exact precision. When probes of extremely small wavelength are used, as with an electron microscope, the system is disturbed, still limiting our knowledge, much as making an electrical measurement alters a circuit. Heisenberg's uncertainty principle asserts that this limit is fundamental and inescapable, as we shall see in quantum mechanics.

    The primary mirror of the orbiting Hubble Space Telescope has a diameter of 2.40 m. Being in orbit, this telescope avoids the degrading effects of atmospheric distortion on its resolution. (a) What is the angle between two just-resolvable point light sources (perhaps two stars)? Assume an average light wavelength of 550 nm. (b) If these two stars are at the 2 million light year distance of the Andromeda galaxy, how close together can they be and still be resolved? (A light year, or ly, is the distance light travels in 1 year.)

    The Rayleigh criterion stated in the equation gives the smallest possible angle between point sources, or the best obtainable resolution. Once this angle is found, the distance between stars can be calculated, since we are given how far away they are.

    Solution for (a)

    The Rayleigh criterion for the minimum resolvable angle is

    θ = 1.22 λ/D

    Entering known values gives

    θ = 1.22 × (550 × 10⁻⁹ m) / (2.40 m) = 2.80 × 10⁻⁷ rad

    The distance s between two objects a distance r away and separated by an angle θ is s = rθ.

    Substituting known values gives

    s = (2.0 × 10⁶ ly)(2.80 × 10⁻⁷ rad) = 0.56 ly

    The angle found in part (a) is extraordinarily small (less than 1/50,000 of a degree), because the primary mirror is so large compared with the wavelength of light. As noted, diffraction effects are most noticeable when light interacts with objects having sizes on the order of the wavelength of light. However, the effect is still there, and there is a diffraction limit to what is observable. The actual resolution of the Hubble Telescope is not quite as good as that found here. As with all instruments, there are other effects, such as non-uniformities in mirrors or aberrations in lenses, that further limit resolution. However, (Figure) gives an indication of the extent of the detail observable with the Hubble because of its size and quality, and especially because it is above the Earth's atmosphere.

    The answer in part (b) indicates that two stars separated by about half a light year can be resolved. The average distance between stars in a galaxy is on the order of 5 light years in the outer parts and about 1 light year near the galactic center. Therefore, the Hubble can resolve most of the individual stars in the Andromeda galaxy, even though it lies at such a huge distance that its light takes 2 million years to reach us. (Figure) shows another mirror used to observe radio waves from outer space.
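The arithmetic of this worked example is easy to verify in a few lines of Python (values taken from the problem statement; since r is given in light years, s comes out in light years as well):

```python
wavelength = 550e-9   # m, average wavelength of visible light
D = 2.40              # m, Hubble primary mirror diameter
r = 2.0e6             # ly, distance to the Andromeda galaxy

theta = 1.22 * wavelength / D   # Rayleigh criterion, radians
s = r * theta                   # smallest resolvable separation, ly

print(f"theta = {theta:.2e} rad")   # theta = 2.80e-07 rad
print(f"s = {s:.2f} ly")            # s = 0.56 ly
```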

    A 305-m-diameter natural bowl at Arecibo in Puerto Rico is lined with reflective material, making it into a radio telescope. It is the largest curved focusing dish in the world. Although D for Arecibo is much larger than for the Hubble Telescope, it detects much longer wavelength radiation and its diffraction limit is significantly poorer than Hubble's. Arecibo is still very useful, because important information is carried by radio waves that is not carried by visible light. (credit: Tatyana Temirbulatova, Flickr)

    Diffraction is not only a problem for optical instruments but also for the electromagnetic radiation itself. Any beam of light having a finite diameter D and a wavelength λ exhibits diffraction spreading. The beam spreads out with an angle θ given by the equation θ = 1.22 λ/D. Take, for example, a laser beam made of rays as parallel as possible (angles between rays as close to 0° as possible); it instead spreads out at an angle θ = 1.22 λ/D, where D is the diameter of the beam and λ is its wavelength. This spreading is impossible to observe for a flashlight, because its beam is not very parallel to start with. However, for long-distance transmission of laser beams or microwave signals, diffraction spreading can be significant (see (Figure)). To avoid this, we can increase D. This is done for laser light sent to the Moon to measure its distance from the Earth. The laser beam is expanded through a telescope to make D much larger and θ smaller.

    The beam produced by this microwave transmission antenna will spread out at a minimum angle θ = 1.22 λ/D due to diffraction. It is impossible to produce a near-parallel beam, because the beam has a limited diameter.

    In most biology laboratories, resolution is presented when the use of the microscope is introduced. The ability of a lens to produce sharp images of two closely spaced point objects is called resolution. The smaller the distance x by which two objects can be separated and still be seen as distinct, the greater the resolution. The resolving power of a lens is defined as that distance x. An expression for resolving power is obtained from the Rayleigh criterion. In (Figure)(a) we have two point objects separated by a distance x. According to the Rayleigh criterion, resolution is possible when the minimum angular separation is θ = 1.22 λ/D = x/d,

    where d is the distance between the specimen and the objective lens, and we have used the small-angle approximation (i.e., we have assumed that x is much smaller than d), so that tan θ ≈ sin θ ≈ θ.

    Therefore, the resolving power is x = 1.22 λd/D.

    Another way to look at this is by re-examining the concept of Numerical Aperture (NA) discussed in Microscopes. There, NA is a measure of the maximum acceptance angle at which a fiber will take light and still contain it within the fiber. (Figure)(b) shows a lens and an object at point P. The NA here is a measure of the ability of the lens to gather light and resolve fine detail. The angle subtended by the lens at its focus is defined to be θ = 2α. From the figure and again using the small-angle approximation, we can write sin α = (D/2)/d = D/(2d).

    The NA for a lens is NA = n sin α, where n is the index of refraction of the medium between the objective lens and the object at point P.

    From this definition for NA, we can see that x = 1.22 λd/D = 1.22 λ/(2 sin α) = 0.61 λn/NA.
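To see that the two forms of the resolving power agree, here is a small numerical check in Python (the aperture diameter and working distance below are hypothetical values chosen for illustration):

```python
wavelength = 550e-9   # m, visible light
n = 1.00              # index of refraction (air between lens and object)
D = 8.0e-3            # m, objective aperture diameter (hypothetical)
d = 20.0e-3           # m, lens-to-object distance (hypothetical)

# Form 1: x = 1.22 * lambda * d / D, from the Rayleigh angle
x1 = 1.22 * wavelength * d / D

# Form 2: x = 0.61 * lambda * n / NA, with NA = n * sin(alpha)
# and sin(alpha) ~ (D/2)/d in the small-angle approximation
NA = n * (D / 2) / d
x2 = 0.61 * wavelength * n / NA

print(f"NA = {NA:.2f}")          # NA = 0.20
print(f"x = {x1 * 1e6:.2f} um")  # x = 1.68 um (x2 gives the same value)
```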

    In a microscope, NA is important because it relates to the resolving power of a lens. A lens with a large NA will be able to resolve finer details. Lenses with larger NA will also be able to collect more light and so give a brighter image. Another way to describe this situation is that the larger the NA, the larger the cone of light that can be brought into the lens, and so more of the diffraction modes will be collected. Thus the microscope has more information to form a clear image, and so its resolving power will be higher.

    (a) Two point objects separated by a distance x and positioned a distance d away from the objective. (credit: Infopro, Wikimedia Commons) (b) Terms and symbols used in the discussion of resolving power for a lens and an object at point P. (credit: Infopro, Wikimedia Commons)

    One of the consequences of diffraction is that the focal point of a beam has a finite width and intensity distribution. Consider focusing when only considering geometric optics, shown in (Figure)(a). The focal point is infinitely small with a huge intensity and the capacity to incinerate most samples irrespective of the NA of the objective lens. For wave optics, due to diffraction, the focal point spreads to become a focal spot (see (Figure)(b)), with the size of the spot decreasing with increasing NA. Consequently, the intensity in the focal spot increases with increasing NA. The higher the NA, the greater the chances of photodegrading the specimen. However, the spot never becomes a true point.

    Section Summary

    • Diffraction limits resolution.
    • For a circular aperture, lens, or mirror, the Rayleigh criterion states that two images are just resolvable when the center of the diffraction pattern of one is directly over the first minimum of the diffraction pattern of the other.
    • This occurs for two point objects separated by the angle θ = 1.22 λ/D, where λ is the wavelength of light (or other electromagnetic radiation) and D is the diameter of the aperture, lens, mirror, etc. This equation also gives the angular spreading of a source of light having a diameter D.

    Conceptual Questions

    A beam of light always spreads out. Why can a beam not be created with parallel rays to prevent spreading? Why can lenses, mirrors, or apertures not be used to correct the spreading?

    Problems & Exercises

    The 3.00 × 10²-m-diameter Arecibo radio telescope pictured in (Figure) detects radio waves with a 4.00 cm average wavelength.

    (a) What is the angle between two just-resolvable point sources for this telescope?

    (b) How close together could these point sources be at the 2 million light year distance of the Andromeda galaxy?


    Assuming the angular resolution found for the Hubble Telescope in (Figure), what is the smallest detail that could be observed on the Moon?

    Diffraction spreading for a flashlight is insignificant compared with other limitations in its optics, such as spherical aberrations in its mirror. To show this, calculate the minimum angular spreading of a flashlight beam that is originally 5.00 cm in diameter with an average wavelength of 600 nm.

    (a) What is the minimum angular spread of a 633-nm wavelength He-Ne laser beam that is originally 1.00 mm in diameter?

    (b) If this laser is aimed at a mountain cliff 15.0 km away, how big will the illuminated spot be?

    (c) How big a spot would be illuminated on the Moon, neglecting atmospheric effects? (This might be done to hit a corner reflector to measure the round-trip time and, hence, distance.) Explicitly show how you follow the steps in Problem-Solving Strategies for Wave Optics.

    A telescope can be used to enlarge the diameter of a laser beam and limit diffraction spreading. The laser beam is sent through the telescope in the direction opposite to normal and can then be projected onto a satellite or the Moon.

    (a) If this is done with the Mount Wilson telescope, producing a 2.54-m-diameter beam of 633-nm light, what is the minimum angular spread of the beam?

    (b) Neglecting atmospheric effects, what is the size of the spot this beam would make on the Moon, assuming a lunar distance of ?


    (b) Diameter of

    The limit to the eye’s acuity is actually related to diffraction by the pupil.

    (a) What is the angle between two just-resolvable points of light for a 3.00-mm-diameter pupil, assuming an average wavelength of 550 nm?

    (b) Take your result to be the practical limit for the eye. What is the greatest possible distance a car can be from you if you can resolve its two headlights, given they are 1.30 m apart?

    (c) What is the distance between two just-resolvable points held at an arm’s length (0.800 m) from your eye?

    (d) How does your answer to (c) compare to details you normally observe in everyday circumstances?

    What is the minimum diameter mirror on a telescope that would allow you to see details as small as 5.00 km on the Moon some 384,000 km away? Assume an average wavelength of 550 nm for the light received.

    You are told not to shoot until you see the whites of their eyes. If the eyes are separated by 6.5 cm and the diameter of your pupil is 5.0 mm, at what distance can you resolve the two eyes using light of wavelength 555 nm?

    (a) The planet Pluto and its Moon Charon are separated by 19,600 km. Neglecting atmospheric effects, should the 5.08-m-diameter Mount Palomar telescope be able to resolve these bodies when they are from Earth? Assume an average wavelength of 550 nm.

    (b) In actuality, it is just barely possible to discern that Pluto and Charon are separate bodies using an Earth-based telescope. What are the reasons for this?

    (a) Yes. It should easily be able to discern them.

    (b) The fact that it is just barely possible to discern that these are separate bodies indicates the severity of atmospheric aberrations.

    The headlights of a car are 1.3 m apart. What is the maximum distance at which the eye can resolve these two headlights? Take the pupil diameter to be 0.40 cm.

    When dots are placed on a page from a laser printer, they must be close enough so that you do not see the individual dots of ink. To do this, the separation of the dots must be less than Rayleigh's criterion. Take the pupil of the eye to be 3.0 mm and the distance from the paper to the eye to be 35 cm; find the minimum separation of two dots such that they cannot be resolved. How many dots per inch (dpi) does this correspond to?
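As a rough sanity check on this exercise (my own sketch, not an official solution), the Rayleigh criterion gives a minimum separation and an equivalent dpi figure, assuming an average visible wavelength of 550 nm:

```python
wavelength = 550e-9   # m, assumed average visible wavelength
D = 3.0e-3            # m, pupil diameter
L = 0.35              # m, distance from paper to eye

theta = 1.22 * wavelength / D   # minimum resolvable angle, rad
x = theta * L                   # minimum dot separation, m
dpi = 0.0254 / x                # one inch divided by that separation

print(f"x = {x * 1e6:.0f} um")  # x = 78 um
print(f"dpi = {dpi:.0f}")       # dpi = 324
```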

    An amateur astronomer wants to build a telescope with a diffraction limit that will allow him to see if there are people on the moons of Jupiter.

    (a) What diameter mirror is needed to be able to see 1.00 m detail on a Jovian Moon at a distance of from Earth? The wavelength of light averages 600 nm.

    (b) What is unreasonable about this result?

    (c) Which assumptions are unreasonable or inconsistent?

    Construct Your Own Problem

    Consider diffraction limits for an electromagnetic wave interacting with a circular object. Construct a problem in which you calculate the limit of angular resolution with a device, using this circular object (such as a lens, mirror, or antenna) to make observations. Also calculate the limit to spatial resolution (such as the size of features observable on the Moon) for observations at a specific distance from the device. Among the things to be considered are the wavelength of electromagnetic radiation used, the size of the circular object, and the distance to the system or phenomenon being observed.

    Grossmann concluded, "Their brains clearly responded to social cues conveyed through the eyes, indicating that even without conscious awareness, human infants are able to detect subtle social cues. The existence of such brain mechanisms in infants likely provides a vital foundation for the development of social interactive skills in humans."

    The Cooperative Eye Hypothesis

    Maintaining eye contact when interacting with another person is probably the most important rule of social engagement. Eye contact allows you to see into the window of another person’s soul to some degree and it builds trust.

    We often create a conscious narrative based on the subconscious social cues that our mind is picking up by reading the saccadic rhythms, rapid eye movements, and the amount and angle of sclera exposed when communicating nonverbally with one another.

    The cooperative eye hypothesis suggests that the eye's distinctive visible characteristics evolved to make it easier for humans to follow another's gaze while communicating or while working together on tasks.

    The cooperative eye hypothesis was first proposed by H. Kobayashi and S. Kohshima in 2002 and was later tested by Michael Tomasello and others at the Max Planck Institute for Evolutionary Anthropology in Germany.

    Interestingly, animal researchers have also found that, in the course of their domestication, dogs have also developed the ability to pick up visual cues from the sclera of humans.

    What Are “Saccades?”

    Saccades are the very quick, simultaneous movements made by the eye to receive visual information and shift the line of vision from one position to another. As visual information is received from the retina it is translated into spatial information and then transferred to motor centers for appropriate motor responses.

    We rely on the accuracy of these movements every moment of our lives. During normal day-to-day conditions, you make about 3-5 saccades per second, which adds up to several hundred thousand saccades a day.

    Monitoring the speed of saccadic movements is an excellent way to objectively measure someone's level of fatigue. Recently, scientists in Europe began using a new Google Glass-type device to monitor the level of fatigue in physicians working overtime by tracking their rapid eye movements.

    Speaking of the saccades of someone who seems hyper-alert: I was amused to watch the precise rapid eye movements of Taylor Swift as she clapped a buzzing fly between her hands during a live interview promoting her album 1989.

    Someone with saccadic dysmetria produces uncontrollable eye movements including microsaccades, ocular flutter, and square wave jerks even when the eye is at rest. The cause of dysmetria is thought to be lesions in the cerebellum or lesions in the proprioceptive nerves that lead to the cerebellum. Your cerebellum is responsible for coordination of visual, spatial, and other sensory information with motor control.

    The cerebellum is also essential for the automatic motor learning of the vestibulo-ocular reflex (VOR) and is responsible for ensuring accurate eye movements in conjunction with head movements. Implicit motor learning in the VOR is in many ways analogous to classical eyeblink conditioning. The circuits of both are structured similarly, and the molecular mechanisms work in the same way.

    Your Sclera, Saccades, and Cerebellum Are Intertwined

    Most of us take the interpretation and projection of scleral social cues for granted since they become innate early in our childhood development. But for children with autism spectrum disorder (ASD) the ability to make eye contact or interpret the social cues held in the eyes and eye whites doesn’t come naturally.

    Fortunately, neuroscientists around the globe are making huge advances towards understanding why people with ASD struggle to interact with others and the world around them.

    Samuel Wang, Associate Professor of Molecular Biology at Princeton University, is doing fascinating research on information processing in the cerebellum, including: its contributions to motor learning, the cerebellar roles in cognitive and affective function, and autism spectrum disorder.

    Wang and his colleagues at Princeton recently discovered that early cerebellum malfunction hinders neural development and could be a possible root of autism. In August 2014, they published this new theory in the journal Neuron.

    Conclusion: The Social Cues Conveyed by Eye Whites Can Strengthen Social Connections

    Healthy social and cognitive development relies on the ability of your brain to consciously and unconsciously interpret social cues held in the eye whites of others.

    For children with autism, the challenges of trying to explicitly learn how to interpret thousands of saccades an hour and the social cues conveyed through eye whites are astronomical. This new research offers more insights as to why it can be so challenging for people with ASD to interact and connect with others and the environments around them.

    Hopefully, these findings will lead to more research and possible interventions that will build stronger and healthier social bonds between people from all walks of life.


    Anatomy of the Eye

    The sclera is the outermost layer of the eyeball. It is the white (and opaque) part of the eyeball. The muscles responsible for moving the eyeball are attached to it at the sclera.

    At the front of the eyeball, the sclera becomes the cornea. The cornea is the transparent, dome-shaped part of the eyeball. Light rays from the outside world first pass through the cornea before reaching the lens. Together with the lens, the cornea is responsible for focusing light on the retina.

    The choroid is the middle layer of the eyeball located between the sclera and the retina. It provides nutrients and oxygen to the outer surface of the retina.

    Anterior Chamber

    The space between the cornea and the lens is known as the anterior chamber. It is filled with a fluid called the aqueous humour. The anterior chamber is also known as the anterior cavity.

    Aqueous humour

    The aqueous humour is a transparent, watery fluid that circulates in the anterior chamber. It provides oxygen and nutrients to the inner eye and exerts fluid pressure that helps maintain the shape of the eye. The aqueous humour is produced by the ciliary body.

    Posterior Chamber

    The posterior chamber is a much larger area than the anterior chamber. It is located opposite the anterior chamber, behind the lens, and is filled with a fluid called the vitreous humour. This space is also referred to as the vitreous body (in strict anatomical usage it is the posterior cavity; the term posterior chamber properly refers to the narrow aqueous-filled space between the iris and the lens), as indicated in the diagram below - anatomy of the eye.

    Anatomy of the eye: cross section of the human eyeball viewed from above


    Vitreous humour

    The vitreous humour is a transparent jelly-like fluid that fills the posterior chamber. It exerts fluid pressure that keeps the retina layers pressed together to maintain the shape of the eye and to maintain sharp focus of images on the retina.

    The choroid continues at the front of the eyeball to form the iris. The iris is a flat, thin, ring-shaped structure projecting into the anterior chamber. This is the part that gives a person's eye its colour. The iris contains circular muscles, which run around the pupil, and radial muscles, which radiate outward from it. When the circular muscles contract they make the pupil smaller; when the radial muscles contract they make the pupil wider.

    Ciliary muscles

    The ciliary muscles are located inside the ciliary body. These are the muscles that continuously change the shape of the lens for near and distant vision. See the diagram, anatomy of the eye, above.

    Ciliary Body

    The choroid continues at the front of the eyeball to form the ciliary body. It produces the aqueous humour. The ciliary body also contains the ciliary muscles that contract or relax to change the shape of the lens.

    The zonule, also known as the suspensory ligaments, is a ring of small fibres that holds the lens suspended in place. It connects the lens to the ciliary body and allows the lens to change shape.

    The lens is a biconvex transparent disc made of proteins called crystallins. It is located directly behind the iris and focuses light onto the retina. In humans, the lens changes shape for near and for distant vision.

    Human Eye Anatomy: cross section of the human eyeball viewed from the side


    The pupil is the hole at the center of the iris, located in front of the lens. The muscles of the iris act like the diaphragm of a camera, increasing or decreasing the size of the pupil to control how much light enters the eyeball.

    The retina is the innermost layer, lining the back of the eyeball. It is the light-sensitive part of the eye. The retina contains photoreceptors that detect light, known as cones and rods: cones enable us to see colour, while rods enable us to see in poor light. The retina also contains nerve cells that transmit signals from the retina to the brain.

    The fovea is a small depression in the retina at the center of the macula. The fovea has a high concentration of cones. It is the part of the retina where visual acuity is greatest.

    Optic nerve

    The optic nerve is located at the back of the eyeball. It contains the axons of the retinal ganglion cells (the nerve cells of the retina) and transmits impulses from the retina to the brain.

    Impulses leave the back of the eyeball at the optic disc, also called the blind spot. It is called the blind spot because it contains no photoreceptors, so any light that falls on it is not detected.

    Eye muscles

    The muscles of the eye are strong and efficient; they work together to move the eyeball in many different directions. The six extraocular muscles are the lateral rectus, medial rectus, superior rectus and inferior rectus, together with the superior and inferior oblique muscles.

    Central Artery and Vein

    The central artery and vein run through the center of the optic nerve. The central artery supplies the retina, while the central vein drains it. In the diagram above (anatomy of the eye), the artery is shown in red and the vein in blue.


    The angular resolution (or spatial resolution) of an optical system can be estimated by Rayleigh's criterion:
    two point sources are just resolved when they are separated by at least the angular radius of the Airy disk, Θ = 1.22 (λ/D) rad,
    where Θ is the angular resolution, λ is the wavelength of light and D is the diameter of the aperture (here, the pupil of the eye). Remember that 360 degrees = 2π radians.

    The diameter of the pupil changes between day and night: during the day it is between 3 mm and 4 mm, while at night it dilates to between 5 mm and 9 mm. In addition, the peak sensitivity of the human eye is at a wavelength of approximately 0.55 μm (the V band). So according to Rayleigh's criterion, we can calculate the spatial resolution of the human eye. Let's say that in daytime the pupil diameter is 3 mm and the wavelength is 0.55 μm; we can apply the rule.

    Θ = 1.22(λ/D) rad = 1.22 × (0.55 μm / 3 mm) × (1 mm / 10³ μm) = 2.24 × 10⁻⁴ rad × (180 deg / π rad) ≈ 0.0128 deg × (3600″ / 1 deg) ≈ 46″ (day)

    Moreover, at night the pupil dilates to as much as 9 mm to let in more light; we can do the same calculation to find the angular resolution of the eye at night.
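The day and night cases can be checked with a short script. This is a minimal sketch of the calculation above (the function name is my own; the 0.55 μm wavelength and 3 mm / 9 mm pupil diameters are the values used in the worked example):

```python
import math

def rayleigh_resolution_arcsec(wavelength_m, aperture_m):
    """Rayleigh's criterion Theta = 1.22 * (lambda / D), converted to arcseconds."""
    theta_rad = 1.22 * wavelength_m / aperture_m
    return math.degrees(theta_rad) * 3600.0

# Peak sensitivity of the eye (~0.55 um) with day (3 mm) and night (9 mm) pupils
day = rayleigh_resolution_arcsec(0.55e-6, 3e-3)
night = rayleigh_resolution_arcsec(0.55e-6, 9e-3)
print(f"day:   {day:.0f} arcsec")    # ~46 arcsec
print(f"night: {night:.0f} arcsec")  # ~15 arcsec
```

Note that although the larger night-time pupil gives a better diffraction limit, the eye's actual acuity at night is limited by the retina (rod vision) and by optical aberrations rather than by diffraction.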



    Eyes to See

    The great diversity of eye designs is not a product of evolution but rather the result of an all-seeing Creator designing the most appropriate eye for every situation and occasion.

    Any engineer who has ever worked on imaging instruments will tell you that the systems do not appear by chance. Yet nothing that engineers have produced begins to compare with what God has designed.4

    When you consider the amazing design of eyes in creation and then consider how eyes grow in the womb, or when you consider how the eye can repair and maintain itself for a lifetime, you have to agree with Solomon that there is only one option: the existence of a Creator who is perfect in knowledge and skill.

    " The hearing ear and the seeing eye, the Lord has made them both " (Proverbs 20:12).