Spiders

**Year 1: Circadian rhythms in social spider behavior.** (Brief Description) Ecologically, spiders are
both predators and prey. Typically, behavior that maximizes
capture success also increases vulnerability to predation
(Riechert & Tracy 1975; Cloudsley-Thompson 1978). Thus, there
is selective pressure on spiders to limit their foraging activity
to when it is most effective and remain secluded in retreats at
other times. We know from casual observation that the social spider
*A. studiosus* is generally nocturnal, retreating deep in
the web during the day, and moving onto the capture sheet at night.
In this project, we will quantify daily rhythms in several
behaviors including locomotor activity patterns, spacing of
individuals within the web, aggressiveness toward prey, and
antipredator behavior. Observations of all these behaviors will be
conducted under LD 12:12 and constant darkness. Time-series
analyses will be employed to characterize the periodicity, or
lack thereof, in the data.
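As an illustration of the kind of time-series analysis involved, the sketch below recovers a 24-hour period from synthetic hourly activity counts using a simple FFT periodogram. All numbers here are hypothetical placeholders for the locomotor-activity data to be collected:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example: 10 days of hourly locomotor-activity counts with a
# 24 h rhythm (nocturnal peak) plus noise. Real data would replace this.
hours = np.arange(240)
activity = 5 + 4 * np.cos(2 * np.pi * (hours - 2) / 24) + rng.normal(0, 1, hours.size)

# Periodogram: spectral power of the mean-centered series.
centered = activity - activity.mean()
power = np.abs(np.fft.rfft(centered)) ** 2
freqs = np.fft.rfftfreq(hours.size, d=1.0)   # cycles per hour

# Skip the zero-frequency bin, then read off the dominant period.
peak = 1 + np.argmax(power[1:])
dominant_period = 1.0 / freqs[peak]          # hours
print(f"dominant period = {dominant_period:.1f} h")
```

A full analysis would also test the peak's significance against a noise null (and handle the free-running periods expected under constant darkness), but the dominant-period readout is the core idea.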

**Year 2: Neurochemical underpinnings of rhythmicity in
social spider behavior.** (Brief Description) Natural diel
rhythms of biogenic amines will be analyzed in
*A. studiosus*, and then correlated with the observed
behavioral rhythms determined from year 1. We will also manipulate
octopamine and serotonin levels in spiders (following Barron et al.
2007) to observe their effects on the specific behaviors listed
above, and on their rhythmicity.

**Year 3: Predicting and testing effects of rhythmicity in spider behavior on prey capture within a cooperative web.**
(Case Study) In this final phase of research on the social spider
system, we will quantify the ecological implications of diel
rhythmicity in spider spacing within the communal web. The spacing
of cooperatively foraging spiders within the web can profoundly
affect their foraging success (Uetz & Hieber 1997). Whether an
insect is captured or escapes when it hits a web depends on
how quickly at least one spider detects and reaches it.
Therefore, there should be an optimal spacing of spiders within a
web that maximizes the collective area of detection and capture. We
will explore this with a combination of lab and field experiments,
agent-based modeling, and logistic regression modeling.

The first phase of this project will be to establish ranges of
detection and speeds of response to prey items that hit the web.
Under standard LD 12:12 laboratory conditions, we will rear
*A. studiosus* colonies of 5 spiders in 15 cm petri dishes.
The spiders will be fed and watered ad libitum for four days, and
then only watered for another four days to control for hunger
state. Modifying the protocol of Pruitt et al. (2008), we will drop
a cricket (of known mass) into the web from a standard height at a
random place in the web and video the spiders’ responses. For
observations in the dark, red light and IR video will be used. We
will test 25 colonies in this manner at each of four phases of the
24-hour cycle (just prior to lights on, mid-photophase, just prior
to lights off, and mid-scotophase). By analyzing the video data, we
will be able to establish zones-of-detection and speeds-of-response
for the spiders (which will inform parameters for the agent-based
model). We will also be able to detect any rhythmicity in prey
capture behavior. To quantify the rate at which prey can escape the
web, we will allow spiders to build webs on a standard substrate,
and then remove them from the web. Then we will blow small and
large flying prey (*Drosophila melanogaster* and *D. hydei*) into the
web and video their escape behavior. This will allow us to estimate
how much time after prey hits the web the spiders have to subdue
it.
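One way these trials could feed the logistic regression modeling is to regress capture outcome (subdued before escape, yes/no) on the distance from the prey's impact point to the nearest spider. The sketch below fits such a model by gradient descent on synthetic data; the "true" coefficients are invented purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical data: distance (cm) from a prey impact to the nearest spider,
# and whether the prey was subdued. Coefficients 4.0 and -0.8 are invented.
n = 500
distance = rng.uniform(0, 12, n)
p_true = 1 / (1 + np.exp(-(4.0 - 0.8 * distance)))
captured = (rng.random(n) < p_true).astype(float)

# Logistic regression fit by plain gradient descent on the log-likelihood.
X = np.column_stack([np.ones(n), distance])   # intercept + distance
w = np.zeros(2)
for _ in range(20_000):
    p = 1 / (1 + np.exp(-X @ w))
    w -= 0.05 * X.T @ (p - captured) / n

b0, b1 = w
print(f"intercept = {b0:.2f}, slope = {b1:.2f} (true values 4.0 and -0.8)")
```

The fitted negative slope quantifies how quickly capture probability decays with distance, which is exactly the quantity the spacing analysis in the modeling phase needs.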

The next phase of the project will be to construct an agent-based
model to predict optimal spacing patterns in social spider
colonies. These models will be developed in the NetLogo programming
environment (Wilensky 1999), a powerful yet user-friendly
interface specifically designed for agent-based modeling. We will
program ‘virtual webs’ in which simulated spiders can
respond to simulated prey hits (with parameters based on the lab
study above). With this system, we can test the effects of
different spacing patterns (i.e. clumped, random, or uniform) and
different densities of spiders on the likelihood of prey detection
and time to intercept. By incorporating prey escape behavior data,
we can actually predict the optimal capture area and spider spacing
that maximizes prey capture success, for a given number of spiders.
We will also approach this question using logistic regression
modeling.
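The proposal specifies NetLogo for the agent-based model; as an illustration of the underlying logic, the minimal Python stand-in below compares clumped, random, and uniform spacing. Every parameter (web size, running speed, escape time) is a hypothetical placeholder for the values to be measured in the lab study:

```python
import numpy as np

rng = np.random.default_rng(1)

# All parameters are hypothetical placeholders for lab-measured values.
WEB = 15.0          # web width (cm), matching the 15 cm petri-dish scale
N_SPIDERS = 5
SPEED = 3.0         # spider running speed (cm/s)
ESCAPE_TIME = 2.0   # time (s) the prey needs to free itself
TRIALS = 10_000

def place(pattern):
    """Return an (N_SPIDERS, 2) array of positions for a spacing pattern."""
    if pattern == "random":
        return rng.uniform(0, WEB, (N_SPIDERS, 2))
    if pattern == "clumped":
        centre = rng.uniform(WEB * 0.4, WEB * 0.6, 2)
        return centre + rng.normal(0, 0.5, (N_SPIDERS, 2))
    if pattern == "uniform":
        q = WEB / 4  # quincunx: four quarter-points plus the centre
        return np.array([[q, q], [q, 3*q], [3*q, q], [3*q, 3*q], [2*q, 2*q]])
    raise ValueError(pattern)

def capture_rate(pattern):
    """Fraction of simulated prey hits reached before the prey escapes."""
    captures = 0
    for _ in range(TRIALS):
        spiders = place(pattern)
        prey = rng.uniform(0, WEB, 2)
        nearest = np.linalg.norm(spiders - prey, axis=1).min()
        captures += int(nearest / SPEED < ESCAPE_TIME)
    return captures / TRIALS

rates = {p: capture_rate(p) for p in ("clumped", "random", "uniform")}
for pattern, rate in rates.items():
    print(f"{pattern:8s} capture rate = {rate:.2f}")
```

Even this toy version reproduces the qualitative prediction that uniform spacing outperforms random spacing, which in turn outperforms clumping; the NetLogo model would add realistic web geometry, detection zones, and prey escape dynamics.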

In the final phase, we will quantify rhythmicity in prey capture under semi-natural conditions. Following the method of Jones and Parker (2002), we will force spiders to build their webs on artificial foliage, which will then be wired into natural foliage in the field. This method exposes spiders to natural levels of prey and predators while preserving our ability to census spiders in the web and quantify prey capture. We will monitor 25 colonies of 10 spiders every four hours for five days in June, and again in July and August. We will record the number and spacing of spiders present, as well as the number, size, and type (to Order) of prey captured. At these same times, we will independently quantify daily rhythms in predator/prey densities using sticky traps (Riechert & Jones 2008). These data will allow us to place patterns of activity, web-spacing, and foraging behavior in an ecological context.

With respect to the observed activity patterns, a wealth of graph-theoretic modeling can be brought to bear. For example, we know that spiders place themselves in a shared web such that, wherever a prey item strikes the web, there will be a spider located close enough to reach it quickly, with reinforcements soon to follow. Such optimal placement of spiders is reminiscent of many well-studied topics in graph theory; note that simply being “equally spaced” is insufficient to guarantee that one and only one spider will be “the closest”. The domination number of a graph is the minimum size, over all possible selections, of a set of vertices with the property that every remaining vertex is adjacent to at least one selected vertex. If a graph has a specified structure, such as a tree, then this number may be expressible as a function of the total number of vertices. In most instances, however, this number is not known.
Using the design of the web as the underlying graph structure, we can begin to study the placement of the spiders in a mathematical setting. Variations on the domination number, such as the locating-domination number, impose added restrictions on the dominating set. A locating-dominating set has the additional property that every vertex outside the set has a nonempty set of neighbors within the dominating set, and no two outside vertices share the same such neighborhood. Thus, if a prey item hits the web and sends vibrations across it, the unique pattern of which dominating spiders sense the vibration identifies the location of the prey.
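To make these definitions concrete, the brute-force sketch below computes a minimum dominating set and a minimum locating-dominating set on a toy seven-vertex "web" graph (a hub, an inner ring, and an outer ring). The graph is an invented stand-in, not a measured web:

```python
from itertools import combinations

# Toy web graph, vertices 0..6: 0 is the hub, 1-3 an inner ring, 4-6 an
# outer ring, with radial threads joining the rings.
edges = {(0, 1), (0, 2), (0, 3), (1, 2), (2, 3), (3, 1),
         (1, 4), (2, 5), (3, 6), (4, 5), (5, 6), (6, 4)}
V = range(7)
adj = {v: {u for e in edges for u in e if v in e and u != v} for v in V}

def dominating(S):
    """Every vertex is in S or adjacent to a vertex of S."""
    return all(v in S or adj[v] & S for v in V)

def locating(S):
    """Vertices outside S have distinct, nonempty neighborhoods in S."""
    sigs = [frozenset(adj[v] & S) for v in V if v not in S]
    return all(sigs) and len(sigs) == len(set(sigs))

def minimum(pred):
    """Smallest vertex set satisfying pred, by exhaustive search."""
    for k in range(1, len(V) + 1):
        for S in combinations(V, k):
            if pred(set(S)):
                return set(S)

dom = minimum(dominating)
loc_dom = minimum(lambda S: dominating(S) and locating(S))
print("minimum dominating set:", sorted(dom))
print("minimum locating-dominating set:", sorted(loc_dom))
```

On this graph the domination number is 2 but the locating-domination number is 3: two "sentry" spiders can guarantee that some spider is adjacent to any prey strike, but three are needed before the pattern of sensed vibrations pinpoints the strike uniquely. Exhaustive search is only feasible for small graphs, which is precisely why closed-form results for structured web graphs would be valuable.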