National USGS Marine Geohazards Workshop

Menlo Park

March 1, 2011
Notes by Amy Draut
Tom Brocher, welcoming remarks:

Recent examples of the importance of marine geohazards to society: estimating slip rates on offshore faults using new technology, and characterization of shoreline faults by USGS scientists in Menlo Park (within 600 m of the Diablo Canyon nuclear power plant main power block, just outside the surf zone), an example of the need for integrated onshore/offshore studies. Another example is earthquake early warning systems, e.g., the Japanese OBS network over subduction zones.


Marcia McNutt’s opening comments:

Marine geohazards is a timely topic given events this year, the BP oil spill among others. There are two parts to any hazards science program: mitigation and response. Both are very important for how science can contribute, and USGS has been key in bringing science to bear in both arenas, bringing science products. In mitigation, our poster child is the timely intervention of USGS in Alaskan pipeline design: the Denali Fault crossing was designed to withstand a M8.0 quake, and when the M7.9 quake happened (2002), the pipeline design successfully withstood it without any spill. When we anticipate a hazard before it happens, we don’t make headlines or get accolades. That may not be very satisfying for having done the right thing, but it’s what we need to do: anticipate problems before they happen. In the BP oil spill, we learned that an ongoing emergency is not the best time to be doing science, scrambling to find out how dispersant works in very deep water, etc. We have to anticipate scenarios. In the ArkStorm scenario planning, Lucy Jones and the hazards team ran a workshop that did well; scenario planning allows you to look at real events and possible ones, and learn what science we need to have in hand to respond, in that case to flooded communities. Those results were appreciated by insurance companies, Target, etc., in terms of deciding where to locate stores (others will follow their example in locating out of flood zones), making them more likely to get loans and insurance for those commercial locations. There was a cascade of understanding through the economic chain in California; they understood the planning implications after ArkStorm.

But it’s not possible to mitigate every consequence of natural hazards, given how much civilization is already in harm’s way, especially in the coastal zone. Cannot mitigate against tsunamis striking all of the population in the coastal zone. So, how can we use science to aid in the response? Often, our scientific basis for determining the appropriate level of response is just not where it needs to be. In Chile, the local tsunami produced the greatest death toll, yet the tele-tsunami was almost a non-event. We need much better understanding of the potential for tele-tsunami generation after such a large quake. FEMA considers PAGER the poster child for what USGS is doing in this regard, helping understand seismograms, faulting, depths, locations, etc., and translating them into a simple tool that tells FEMA not just magnitude and location but also how many people are affected, the likely death toll, and likely economic losses on a timely basis. In terms of response for other kinds of hazards, what is the PAGER-like tool that can be developed, that tells responders the critical information they need on a timely basis? For a tsunami, often the runup information is most needed for vertical evacuation decisions.

Think broadly about hazards that we may not have thought about thoroughly. The BP oil spill from Deepwater Horizon wasn’t on people’s minds before it happened. Think not only in a uniformitarian way; consider also climate change effects and others too. Methane hydrates causing offshore platforms to destabilize, e.g. (Charlie Paull had an example, but no rates of seafloor movement). Use the geologic record to try to understand, with climate change, what are the rates of some movements offshore? There have been major methane releases offshore; what hazards do they pose, and at what rates, to offshore wind farms, platforms, cables? USGS should be ahead of the curve in deciding what risk those hazards pose to structures offshore.

We need to identify outstanding science issues. Concern that USGS doesn’t always have all the infrastructure we need to address them (e.g., fleets of deepwater ships), so need to know what partnerships to build and strengthen (with NOAA, NSF) to achieve our agenda. Recognizing budget limitations will go on for many years. But Marcia is a strong proponent of proactively identifying issues that we need to get out ahead of, that affect safety especially. Need to find a way to address this (loudly) and find resources to get the work done.

Question from Sean Gulick: In rapid response to Haiti earthquake, found small, not too expensive things that could have been done ahead of time to reduce/mitigate damage. How do we enable the system to spend small amounts of money on fixes easily identifiable before events happen?

Marcia: in the US, system somewhat broken because of earmark history. This may worsen, where money won’t be spent on anything proactively. But there may be more emphasis on communities deciding at state or local level what to prioritize. That would mean USGS efforts to work with states would be even more important, state geological surveys.

Question from Uri ten Brink: Ship capacity is low now, yet it’s very hard for us at USGS to get on them; we feel like a poor relative coming through the back door asking for favors, and it is difficult to cooperate in trying to get ship time. Why can it not be easier for us to use taxpayer-funded vessels?

Marcia: Good point. UNOLS fleet funded by agencies using the ships.

Uri: There’s nothing we can do here; we need advocacy at a higher level to get us access to those ships. Chris Goldfinger agrees: NSF is complaining that not enough ship use is happening.

Marcia: worth trying to intervene with OSTP to get some funding for USGS to use the ships. May be able to improve this.

Bill Schwab: Statement of goals

(as written in handouts with workshop materials).

What constitutes marine geohazards? Ruled out including severe storms, gradual coastal erosion, etc. Focus on sudden, extreme geologic events: marine earthquakes, volcanic eruptions or collapses, submarine slope failures. Unintentional man-made venting during oil and gas exploration/extraction, in the context of slope stability problems. Quantifying long-term probabilities, assessments of these hazards. What is appropriate federal, USGS role in all of this? Especially with tight budgets, how do we form more effective partnerships?

Steve Kirby: We will focus on marine geohazards in context of the coastal and marine program, but other programs also involved – earthquakes, in particular national probabilistic earthquake hazard maps of coastal areas are very relevant, and Tsunami Source Working Group understanding subduction-zone earthquake sizes, processes, tsunami hazards. Volcano hazard mapping and geophysics very relevant. Want to emulate what earthquake hazards program has done in national probabilistic seismic mapping effort, apply some of the same principles to marine geohazards. Workshop should formulate approaches to this. CMG is now within Natural Hazards mission area, we have opportunity to sharpen up that relationship as regards geohazards part of coastal and marine program. Part of SSPT activities to formulate ideas for how the realignment can create opportunities to strengthen marine geohazards research. Importance of partnering, and the deepwater vessels issue. NSF GeoPrisms program (successor of MARGINS) has chosen Alaska and Cascadia margins for 10-year focus – this is a real opportunity for USGS to work with university colleagues in bringing in geohazards aspects of their science there. What are our roles and those of our partners? We need to set priorities, an extended process beyond this workshop and report. These are difficult times with polarized politics and budget cuts, but our sights must be long-term. USGS has weathered similar storms in the past and come out okay.

Question from Jody Bourgeois: Are San Francisco Bay and Puget Sound, interior seaways, included in offshore hazards discussions?

Yes.

Cascadia subduction zone
Mark Petersen: talk by Petersen, Frankel, Harmsen, and Hayes.

Probabilistic mapping approach as applied to the Cascadia margin, national seismic hazard maps. Many aspects go into developing seismic hazard models, including paleoseismology, potential field data, seismic reflection/refraction, crustal velocity measurements, ground motion studies, lidar, geologic mapping, etc. These go into products including building design, shakemaps, forecasts and synthetic seismograms, and liquefaction, landslide, and surface rupture maps. Basis for seismic design provisions in building codes and retrofits; FEMA is very dependent on them. Emergency preparedness and early warning, land-use planning, and earthquake insurance. Products should drive the kinds of science being done. Probabilistic seismic hazard analysis (PSHA) examples given: need earthquake size and distance, specify recurrence rates for quakes of each source, how strong the ground motions are, and their variability. Produce a hazard curve that describes the probability of having ground motions greater than a certain intensity (annual probability of exceedance) that engineers can use. USGS develops the national seismic hazard maps with input from the external community. Will have a Pacific NW workshop in 2012. Recently held a workshop on turbidite data at Oregon State University. This information goes to the engineering community; they develop design criteria, international building codes. We are responsible for making sure they get the best possible science for those maps, updated as new science becomes available, and vetted for quality. Can’t have fluctuating building codes; retrofitting is much more expensive than designing buildings right in the first place. Make ground-motion models showing how motion decays with distance, e.g., for the Chile M8.8 quake last year. Assembling seismic hazard maps now for American Samoa and other areas: Alaska, Hawaii, conterminous U.S., Puerto Rico, Guam and Marianas.
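The PSHA recipe sketched above (specify source recurrence rates, a ground-motion model with variability, and combine them into an annual probability of exceedance) can be illustrated with a minimal calculation. This is only a toy sketch under a lognormal ground-motion assumption, with hypothetical source parameters; it is not the method behind the national maps:

```python
import math
from statistics import NormalDist

def annual_exceedance_rate(sources, gm_level):
    """Annual rate of exceeding gm_level (in g), summed over sources.

    Each source is (annual_rate, median_gm_g, sigma_ln): an occurrence
    rate plus a lognormal ground-motion distribution at the site.
    """
    total = 0.0
    for rate, median, sigma_ln in sources:
        # P(ground motion > gm_level | an event on this source)
        p_exceed = 1.0 - NormalDist(math.log(median), sigma_ln).cdf(math.log(gm_level))
        total += rate * p_exceed
    return total

def poisson_prob(rate, years):
    """Probability of at least one exceedance in `years` (Poisson model)."""
    return 1.0 - math.exp(-rate * years)

# Hypothetical two-source site; rates and medians are illustrative only.
sources = [(1 / 500.0, 0.4, 0.6),   # infrequent large source
           (1 / 50.0, 0.1, 0.6)]    # frequent moderate source
rate = annual_exceedance_rate(sources, 0.2)  # threshold of 0.2 g
p50 = poisson_prob(rate, 50)                 # 50-year exceedance probability
```

Repeating the last two lines over a range of thresholds traces out the hazard curve that engineers read design values from.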

For Cascadia subduction zone, what are the major gaps in knowledge? Need fault geometry, depths, recurrence interval, types of ground shaking that occur there. Olympia Fault – have no recurrence information. Seattle Fault, Tacoma Fault, others? Need more information for all of those. Marine groups could help substantially. Have information on depth of subducting slab (shallow dips) but few large earthquakes recorded so far to define it. Need seismogenic depths, coseismic rupture extent. Have been using ETS models recently to look at stress accumulation and dissipation (Chapman and Melbourne, 2009), thermal models of 350-degree isotherm, geodetic models. Some data sets conflict with others, need more research focus to resolve those conflicts. Use Slab1.0 to infer seismogenic width, get 3D subduction zone geometry to develop consensus models to estimate future tsunamis, earthquake size on Cascadia subduction zone.

New research on Cascadia subduction zone source recurrence: identified 19 events with full ruptures, recurrence interval ~500 years in the Holocene. But 41 events define a Holocene recurrence for the southern Cascadia margin of ~240 years. May have to separate northern and southern sections and compare on- and offshore recurrence. Goldfinger et al. (2010) for the southern part of the subduction zone. Those differences are very critical to the engineering community.
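One way to see why the full-margin versus southern-margin recurrence difference matters to engineers is a simple time-independent (Poisson) probability over a 50-year design window. This is a first-order illustration, not necessarily the recurrence model used in the hazard maps:

```python
import math

def poisson_prob(recurrence_yr, window_yr):
    """P(at least one event in the window) for a Poisson process
    with mean recurrence interval recurrence_yr."""
    return 1.0 - math.exp(-window_yr / recurrence_yr)

# Full-margin ruptures (~500 yr) vs. southern-margin events (~240 yr),
# over a 50-year building-design window:
p_full = poisson_prob(500, 50)    # ~0.10
p_south = poisson_prob(240, 50)   # ~0.19
```

The shorter southern recurrence roughly doubles the 50-year probability, which is why separating the sections matters for design values.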

Priorities they see: looking at more onshore sites for evidence of M8 events. More core locations for turbidites (Hydrate Ridge to Rogue). Tracking turbidites with Chirp data offshore. Alternative correlation possibilities. How much ground motion is needed to trigger turbidites? Can M7 do it? More research into uplift data.

Other questions: where is the subduction zone slab interface, and how does it rupture (from historic earthquakes)? More reflection profiling needed. One event or earthquake clusters? Persistent segment boundaries? What kinds of ground shaking occur? Why do some areas not have deep quakes, e.g., beneath Oregon? Had a big outer-rise quake in the Samoa/Tonga trench, but have not modeled any outer-rise earthquakes in Alaska or Cascadia; do they occur there? Can accretionary wedge sources rupture in large quakes? (M7.6 in New Hebrides). Can we quantify strike-slip faults offshore for use in PSHA, e.g., southern CA?



Chris Goldfinger: Cascadia great earthquake records from offshore (USGS/NSF work).

Location, location, location: submarine canyons that traverse the locked zone, making them sensitive to ground shaking, are favored. Often isolated from rivers during highstand conditions (relict, Pleistocene canyons), so they can capture Holocene earthquake-generated turbidites as opposed to those from other causes. Showed the Rogue canyon system: broad catchment, feeds onto a small apron where they have 10 cores. Also Hydrate Ridge basin west canyon, rimmed by mountains and isolated from terrigenous sources now. Use this site to test correlations with other sites, testing earthquake origin in the stratigraphic record. Have 29 new cores too. Spatial coverage is important; must be broad. Primary criteria for distinguishing earthquakes are areal extent, synchroneity (within a few minutes to hours at most), and sedimentology. Identified 13 turbidites above the Mount Mazama ash horizon that are probably earthquake-generated. Correlations are made based on grain-size/physical property (density, magnetics) “fingerprints” within a 14C-age framework (have many radiocarbon ages). Correlations work well, even though the depositional environments are not at all connected. Can correlate some of these events along 1000 km of the subduction zone. The main structure of the turbidites is fining-upward pulses (Bouma A-C), capped by a fining-upward tail. The pulsing structure is commonly maintained through channel confluences. CT imagery shows turbidite structure. Muddy turbidites have sharp bases and fine upward; many are very bioturbated, obscuring detail. Those have limited strike extent; tentative correlations indicate an extra set of earthquakes.

Also studied lakes onshore with potential matches to marine turbidites. Sanger Lake, CA, and three others, plus an inlet on Vancouver Island, may also record similar shaking. Briles et al. (2008). Potential for inverting paleoseismic data to develop a slip model.

Smith/Klamath area abyssal plain Chirp 3.5 kHz records show clustering of events (can see individual turbidites). Made an offshore space-time diagram for the last 2800 years, correlating onshore and offshore data. Also inferred rupture lengths for the past 3000 years. Reiterates the recurrence intervals mentioned by Mark Petersen. Segment boundaries may correlate with ETS boundaries proposed by Brudzinski et al., 2007, though both sets of boundaries are “crude”. Southern segment: recurrence interval 240-260 years, and 220 years in the past 3000 years. Northern San Andreas Fault recurrence during that time is similar, ~200 years. Found a temporal connection between Cascadia and the NSAF, where 80% of events in the last 3000 years seem to be associated in time in both places. The penultimate NSAF event, 1690-1715 A.D., is indistinguishable from the 1700 Cascadia event. Could be stress coupling between the two. Why segments in the south but not the north? Maybe (Ruff, 1985, idea) caused by sediment supply decreasing southward, exposing plate roughness and maybe forearc structure (asperities). Blanco Fracture Zone roughness entering the trench?

Downdip limits share a common feature, a seaward swing in northern-central Oregon, with cause unknown: heat flow, stress field, structural indicators. Priest et al., 2009; McCaffrey et al., 2007; Burgette et al., 2009 (tide gage estimate). Inland focal solutions show N-S compression of the forearc. The central Cascadia segment may be caused by a pinchout of the locked zone, cause unknown. Clusters don’t exist in a temporal sense; not sure yet if statistically significant.

Probabilities: in 50 years, we will have exceeded all the known recurrence intervals of the 41 events in turbidite record.

Long records are similarly possible from Sumatra, the Iberian margin, and elsewhere. Sediment supply effects on segmentation are also applicable to Sumatra. There are also giant landslides in Cascadia on the scale of the Hawaiian slides, though the repeat times are long (500,000 years).
Brian Atwater:

Marsh sections show buried paleosol, tsunami deposit, mud from tidal flat inferred to be after land subsided in 1700 earthquake.

Note that healthy marine science programs should contribute to science education efforts (middle, high school, and community college educators).

National Research Council, 2010, Tsunami warning and preparedness – an assessment of the US tsunami program and the nation’s preparedness efforts: National Academies Press.

Dunbar and Weaver, 2008, US States and territories national tsunami hazard assessment – historical record and sources for waves: National Tsunami Hazard Mitigation Program.

Wood, 2007, Variations in city exposure and sensitivity to tsunami hazards in Oregon: USGS SIR 2007-5283.

National tsunami risk assessment recommended by NRC: vulnerability × consequences. This is not NOAA’s role; they do risk assessment once the tsunami is under way.

2010 Chile earthquake and tsunami, and other international work. Run into tension between “scientific imperialism and scientific diplomacy”. A national program here on marine hazards would have to have some international component; how that plays out could be tricky. Dealing between scientists and other hazards people (domestic and international, state and local partners).


Comment from Jody Bourgeois: Kuril/Kamchatka tsunamis affect US territories and have caused damage. Chile also. Not only is international work important, but those subduction zones generate tsunamis that affect the US and its territories.

Steve Kirby: those giant earthquakes are infrequent enough that we must study them worldwide.

Marcia McNutt: Mark Petersen’s talk on hazard maps, shakemaps reminded her of what we can do for scientific basis for investment in reducing hazards. USGS has a lot of information about where we are most vulnerable, because we have information in terms of probabilistic forecasting in 30, 50, 100 yr recurrence rates for faults. Have done detailed ground motion forecasting in CA, and worked with engineers putting sensors on bridges, buildings, studying motion in small quakes. Use for deciding which structures need refitting sooner rather than later. But when we look at e.g., 1700 Cascadia quake, brings up question of very infrequent events. Much of our infrastructure investment is directed at smaller, more frequent failures where we know we have return on our investment that is easily justifiable within the lifetime of that building – saving lives, saving building. But when we have many iterations of that building before a large quake happens, need more discussion of how much engineering is practical and a good investment.

Steve Kirby: Sometimes mitigation of those is not that costly, and is effective at saving lives, but may be different than our California experience.

Bruce Jaffe: question for Chris Goldfinger on segmentation. Has anyone looked at onshore tsunami deposit record for segmentation correlation?

Chris Goldfinger: Yes, looking at Bradley Lake, have nearly identical onshore and offshore records. Size of events seems to match well too based on extent of sand sheet across lake, mud rip-ups etc.

Michael Hamburger (Indiana University): State of the art of using GPS data, dealing with inconsistencies between GPS and paleoseismic data or models in interpreting Cascadia processes?

Mark Petersen: GPS data have recently become useful, with 20 yrs of measurements. They do correlate well with onshore geologic structures. In CA, few discrepancies. Elsewhere, including the Pacific NW, correlations are tougher and can have a factor-of-2 difference. Models are very valuable for insights into strain and slip velocities, but need to understand alternative models too. Still working on how to deal with the discrepancies; plan to have a workshop to address this later this year.

Sean Gulick: are we working our way toward local tsunami hazards (landslides too) as well as transocean hazards in an integrated way?

Brian Atwater: By 1996 or 1997, this kind of mapping was under way. Goes state by state, states use somewhat different approaches. NRC thought there should be more uniformity in approach, workshops held periodically to review tsunami sources nationwide and how they are treated.

Craig Weaver: in WA, governor makes policy decisions – there’s only one evacuation line, same for tsunamis and earthquakes. OR has two evacuation plans, one for local and one for extreme Alaska events. Policy decisions different in different states. WA is more interested in whether we underestimated extreme runup extents based on what happened in Indonesia. For much of the coast, dealing with far-field sources is a bigger deal than local, because much of the coast doesn’t have local tsunami sources.

Sam Johnson: We expect a lot of shaking from next Cascadia major event. Subsidence is also a major hazard from these events. How well do we understand that hazard and communicate it?

Brian Atwater: Much is unknown about shaking levels and liquefaction. As for subsidence and land-level change, the consequences would allow storms to attack beaches; beaches would recede for a while afterward. Some groundwork had been laid for considering that, in a framework similar to that of sea-level rise: instantaneous SL rise with some probability.

Craig Weaver: in a previous event, FEMA issued notice to mariners that seafloor had moved up or down ~1 m.

Guy Gelfenbaum: GPR profiles across the Cascadia coast show erosional scarps 100s of m behind the current shoreline; can correlate those with the subsidence events that Atwater worked on. In Sumatra, saw coseismic subsidence of 1-2 m in some places. Erosion/accretion time scales after those events are not well understood.

Steve Kirby: are turbidity currents and tsunamis on land erosive enough to lose part of the deposit?

Chris Goldfinger: Yes. That’s why you need many cores with broad spatial distribution; see occasional turbidites vanishing from one site to the next, probably obliterated by the turbidite above it, maybe 5% of the time but it does happen.

Brian Atwater: oxygen also damages buried soils, but can find ways around this problem usually.

Jody Bourgeois: tsunamis can be very erosive, some scarps may have been generated by tsunami erosion (referring to Guy’s comment above). That’s at least part of the coastal hazard they pose.

Holly Ryan: Offshore San Andreas Fault system.

Provide information for hazards studies, CA earthquake probabilities. Near-surface, higher-resolution studies: age of fault offset, slip rates, recurrence intervals, fault length, orientation, geometric complexities. 3D geology of fault zones (seismogenic zones), National Archive of Marine Seismic Surveys (NAMSS), offshore well data, geopotential data, microseismicity. Areas for possible future work: San Onofre, north of the Mendocino triple junction, southern CA borderland south of the border. Refer to the CA Geological Survey fault activity map of 2010. Many offshore faults are known to have Quaternary offset but are undated. We really need dates. Age of fault offset depends on CORING, plus good geophysical data. Dating the cores. Need slip rates (acoustic trenching), cooperation with MBARI. Globally referenced seafloor geodetic station (developed at Scripps) as an alternative way to get offshore fault slip rates. Recurrence intervals.

MBARI’s AUVs collect very high resolution multibeam data to map seafloor features at 10s of cm scale, can calculate slip rates on offshore faults. E.g., Palos Verdes Fault slip rate estimated from Chirp record and gravity core, dated with 14C ages on benthic forams in core – also gives slip rate and sedimentation rate both of about 3 mm/yr. San Diego Trough fault zone; cores collected by ROV in channel wall, working on slip rates for faults there now. This fault zone is currently not included in Working Group on CA Earthquake Probabilities (WGCEP).
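The slip-rate arithmetic behind such estimates is straightforward once an offset feature is dated; the numbers below are hypothetical round figures for illustration, not the actual Palos Verdes measurements:

```python
def slip_rate_mm_per_yr(offset_m, age_yr):
    """Fault slip rate from a dated offset marker (e.g., an offset
    channel imaged in Chirp data, dated with 14C ages from a core)."""
    return offset_m * 1000.0 / age_yr

# Hypothetical example: a 30 m lateral offset in a ~10,000-year-old horizon
rate = slip_rate_mm_per_yr(30.0, 10_000)  # 3.0 mm/yr
```

The uncertainty in such a rate is dominated by the dating and by how well the piercing point is defined, which is why high-resolution AUV bathymetry plus cores is so valuable.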

We really need deepwater multibeam data throughout this area. Have recent multibeam bathymetry on slope offshore San Diego county. Can only go down to 800-900 m (slope base) now. See drainage patterns, canyons, and can look for offset in those. Could partner with NOAA and UNOLS ships, but really need more multibeam data. Is it likely that San Diego trough and San Pedro Basin faults will rupture in a M7.8 earthquake? Are there barriers along the fault zone that would preclude rupture all along it? This is where 1986 Oceanside earthquake swarm caused $1 million damage (restraining bend of San Diego trough fault zone).

Provide information for hazards studies on northern SAF between Point Arena and Mendocino (with Sam Johnson). Really nice multibeam data showing seafloor expression of active faulting, where present (e.g., there’s no surface expression offshore Golden Gate).

Hayward-Rogers Creek fault stepover. Urban fault junction, we know almost nothing about its continuity below 2 km depth. (Tom Parsons will discuss further on Wednesday). How does this affect maximum rupture possibilities?

3D geology of fault zones: deep penetration reflection/refraction more difficult than in the past because of permitting requirements. NAMSS outer continental shelf well data useful for understanding geology at depth. 3D finite element modeling, gravity and magnetic data, and microseismicity. NAMSS data set is very extensive. As part of San Andreas 3D, 4D work (Geology & Geophysics, Earth Vision), worked offshore Point Reyes getting depth to top of acoustic basement.

Long-term slip simulations using 3D kinematic models can be used to investigate vertical deformation rates as a consequence of fault geometric complexities (work with San Francisco State Univ.). Grove et al. (2010) measured Quaternary uplift of marine terraces at 1 mm/yr. North of Point Reyes, subsidence is indicated by several transgressive surfaces (Quaternary). Slip steps back over into SAF as you go around Point Reyes.

Microseismicity being used by Jeanne Hardebeck to study San Luis Obispo earthquakes, located using 1D velocity model, double-differencing techniques. See earthquakes align along lineations.

Would be great to have real-time OBS stations offshore to identify fault interactions.

Janet Watt’s work: towed magnetometer data, combined with topo/bathy data near Diablo Canyon. Use those to determine long-term fault offset, dip, and continuity. Marine magnetics and gravity data really help and are low-cost.

This spring, we are collecting magnetic and gravity data in Puget Sound too. Quaternary faults there affect basin sediments, should be readily identifiable in gravity.

Future work? NRC (Nuclear Regulatory Commission): SONGS (San Onofre Nuclear Generating Station; work for Southern California Edison), providing expertise related to relicensing the San Onofre nuclear facility. Find out whether the Oceanside blind thrust is an active fault. The nuclear facility was built for M7 on the Newport-Inglewood fault, but not for those blind thrusts. What’s the extent of that structure, and is it active? Have mini-sparker data (USGS) and an AUV survey (MBARI), and tried to core (unsuccessfully). Resolution matters!

Also Mendocino triple junction, where Mw6.5 earthquake happened January 10, 2010. USGS did not do much, but should we put out an OBS to capture aftershocks from events like this? Should we do rapid-response offshore mapping of surface ruptures? That area had 18-19 earthquakes Mw5.6 to 7.7 between 1980 and 2008. Broad deformation zone, intersection of 4 blocks. Should we focus on this area for research?

Southern CA borderlands south of US/Mexico border, do cooperative work with Mexico and academia to really understand that system. Neotectonics near Baja California. How is slip partitioned from San Miguel and Agua Blanca fault zones? Baja California shear zone – not completely attached to Pacific Plate. Borderlands workshop at SCEC. Native Americans called the area “Earthquake Bay” just north of US/Mexico border.

Question from Steve Kirby: Putting out OBSs seems very important. If there are earthquakes on those faults, that obviously says they are truly active. Japan does rapid-response surveys offshore routinely after earthquakes and they know much more about their earthquakes as a result. We are hearing the theme of needing high-res bathymetry in many areas. In terms of priorities, the San Diego area seems like it should be high given population and infrastructure (Holly agrees and wants high-res bathymetry over the entire fault zone).

Brian Collins: OBSs in 2007 earthquake were essential for determining effects.

Holly Ryan: San Onofre might provide opportunity to deploy realtime OBS.

Sean Gulick: We are missing much data inside CA waters because of permission problems inside the 3 mile limit.

Sam Johnson: In the last 2.5 years, CA has initiated seafloor mapping program that is conducting high resolution seafloor mapping. Much of the mapping in CA state waters for that program that Holly mentions has happened already but the data has not been released yet. West Coast Governors’ Agreement on ocean health is sponsoring this type of data collection in CA, OR, WA. Includes mini-sparker and Chirp data at 80-1250 m line spacing over about 40% of the CA coast. Oregon state waters now about 50% mapped. Sam and Chris Goldfinger are involved.


