
Monday, November 14, 2016

NGSS and Smart Science Education

The Next Generation Science Standards (NGSS), with their emphasis on investigation, are forcing states and districts across the country to review and revise their science curricula.  Professional development (PD) for these standards has taken a front seat ahead of other science PD.

The Smart Science® online lessons were developed years before NGSS and even before the famous “America’s Lab Report” (ALR) was published.  Yet, they fit well with both the ALR recommendations and the NGSS requirements because they were created by scientists whose greatest concern centered on students understanding the nature of science rather than memorizing long lists of science vocabulary, formulas, and procedures.

The NGSS organizes its recommendations into three areas of information.

  1. Performance Expectations
  2. Foundations
  3. Coherence

The first area, Performance Expectations, sets forth specific topics and their expectations.  However, unlike old standards, these expectations begin with words such as “Construct,” “Conduct,” “Develop,” “Apply,” and “Plan.”  Smart Science® experiments support these expectations.

The second area, Foundations, shows, for each Performance Expectation, the Practice, Disciplinary Core Idea (DCI), and Crosscutting Concept that support it.

The third area, Coherence, connects each Performance Expectation to other Performance Expectations and to both ELA and math standards in the Common Core.

You will find references to modeling and to engineering design throughout the standards as well, illustrating that NGSS relates well to the STEM movement.

How will Smart Science online science lessons help teachers meet the NGSS standards?  The answer is — in every way, and they’re getting better.

First, some background on Smart Science lessons will help in following the explanations.  These lessons use a 5+1 learning pedagogy.

  1. Engage students with a video, modest text, and some opinion questions related to the lesson topic.
  2. Challenge students to make predictions, which may be based on models of data behavior.  Plenty of background material is provided to help students decide which predictions to make.
  3. Interact with real experiments to make hands-on measurements; students may choose to use only some of the available experiments.  The data are shown in both graphical and tabular formats.
  4. Answer questions designed to ensure students were paying attention during data gathering and to delve into the deeper implications of those data.
  5. Write about the experimental experience (constructed response) in a series of prompted text areas.
  +1. Explore implications of this lesson material in other areas through an open-ended project-based activity.

Performance Expectations

Smart Science Education will be expanding its coverage of the NGSS performance expectations to fill in a few minor gaps.  The technology already has the necessary tools.  For example, it already supports online activities that use real images and videos but involve no measurement.  It also offers “wet” labs, also known as do-it-yourself (DIY) labs, which are done in the classroom or kitchen away from the computer, with the results entered into the computer afterward.

Foundations

Foundation #1: Eight Practices

1. Asking questions and defining problems

Every Smart Science® experiential (experiment-based) lesson starts with some focusing questions followed by predicting outcomes.  These both request that students ask questions.

2. Developing and using models

Many of the lessons have qualitative or quantitative predictions, models of the behavior that may occur.  The selected or written prediction is kept before the student during the experimentation phase to ensure that the chosen model remains uppermost while measurements are being made.  Results may prompt a student to modify that prediction.

At the teacher’s option, a curve fit may be made to the data values.  This fit represents a model that students should relate to the actual phenomenon being studied.
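
To make the idea concrete, here is a minimal sketch of such a fit, assuming hypothetical falling-object measurements and NumPy's polynomial fit; the numbers and names are illustrative only and are not taken from any Smart Science® lesson.

```python
import numpy as np

# Hypothetical student measurements from a falling-object experiment:
# elapsed time (s) and distance fallen (m).  Illustrative values only.
t = np.array([0.0, 0.1, 0.2, 0.3, 0.4, 0.5])
d = np.array([0.00, 0.05, 0.20, 0.44, 0.78, 1.23])

# Fit d = a*t^2 + b*t + c.  The fitted curve is the model; for free
# fall we expect the quadratic coefficient a to be close to g/2.
a, b, c = np.polyfit(t, d, 2)
print(f"inferred acceleration = {2 * a:.2f} m/s^2")   # roughly 9.8
```

The point is that the curve, not the raw numbers, is the model, and students are asked to relate that model back to the phenomenon they measured.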

3. Planning and carrying out investigations

Students may choose among experiments and then measure each value collected interactively.  Their care in measurement affects the data quality.  Some lessons have students deciding how to categorize results.  After finishing with one experiment, students choose which one to do next.

Many students will do every single experiment, a sometimes exhausting exercise in the more advanced lessons.  Teachers guide students to choose carefully as scientists often must do.

4. Analyzing and interpreting data

The questions after the investigation and the online written lab report encourage students to figure out what their data mean.  Making measurements may not fully engage students in thinking about the lesson topic.  They must be queried and then asked to put the results and conclusions in their own words, which is exactly what the Smart Science® system requires.

5. Using mathematics and computational thinking

Some science lessons are qualitative; others are quantitative.  The quantitative ones involve mathematics in a number of ways.  They may request mathematical data analysis or understanding terms such as period, amplitude, and phase.  Lessons with multiple achievement levels will have more mathematics at the higher levels.
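
As a sketch of what mathematical data analysis can look like at a higher achievement level, the snippet below fits a sinusoid to sampled oscillation data to extract amplitude, period, and phase.  It assumes SciPy is available; the sample data are synthetic and purely illustrative, not drawn from a Smart Science® lesson.

```python
import numpy as np
from scipy.optimize import curve_fit

# Synthetic oscillation readings standing in for student measurements.
t = np.linspace(0.0, 4.0, 40)                       # time (s)
x = 0.12 * np.sin(2 * np.pi * t / 1.6 + 0.4)        # displacement (m)

def oscillation(t, amplitude, period, phase):
    """Simple model: x(t) = A * sin(2*pi*t/T + phi)."""
    return amplitude * np.sin(2 * np.pi * t / period + phase)

# Reasonable initial guesses help the fit converge to sensible values.
(A, T, phi), _ = curve_fit(oscillation, t, x, p0=[0.1, 1.5, 0.0])
print(f"amplitude = {A:.2f} m, period = {T:.2f} s, phase = {phi:.2f} rad")
```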

6. Constructing explanations and designing solutions

The report phase of each lesson has the purpose of asking students to construct explanations for what they observe.  They are prompted to do so in a series of text areas that may be augmented with essay questions about the specific lesson.

7. Engaging in argument from evidence

Teacher materials encourage teachers to have their students present their data and conclusions to the entire student group.  Teachers can then monitor the discussion to ensure that all arguments arise from evidence and not conjecture.

The reports also provide a mechanism for students to review their data and consider it as evidence to support their conclusions.

8. Obtaining, evaluating, and communicating information

The extra exploration activities extend the lesson experience.  They often require students to seek out information and write about what they find out.

Foundation #2: Disciplinary Core Ideas (DCI)

These are too numerous to list here.

Here’s one example from fourth-grade physical science.

Performance Expectation: 4-PS3-1. Use evidence to construct an explanation relating the speed of an object to the energy of that object.

DCI: PS3.A. Definitions of Energy — The faster a given object is moving, the more energy it possesses.

Smart Science example: Pendulums and Energy.  This lesson compares the kinetic and potential energy of a pendulum bob as it swings and uses the student measurements rather than a theoretical equation.
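
A rough sketch of the kind of calculation such a lesson calls for appears below, using made-up measurements of bob height and speed; the mass, values, and variable names are illustrative, not drawn from the actual lesson.

```python
import numpy as np

# Made-up pendulum measurements: bob height above the lowest point (m)
# and bob speed (m/s) at several points in one swing.
mass = 0.20                     # kg (assumed)
g = 9.8                         # m/s^2
height = np.array([0.050, 0.030, 0.010, 0.000, 0.010, 0.030, 0.050])
speed  = np.array([0.00,  0.63,  0.89,  0.99,  0.89,  0.63,  0.00])

potential = mass * g * height          # PE = m * g * h
kinetic = 0.5 * mass * speed ** 2      # KE = (1/2) * m * v^2

# If mechanical energy is conserved, PE + KE stays roughly constant
# as the bob trades height for speed across the swing.
print(np.round(potential + kinetic, 3))
```

Because the inputs are measured rather than generated from a theoretical equation, small departures from constancy invite discussion of friction and measurement error.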

Here’s one from fourth-grade life science.

Performance Expectation: 4-LS1-1. Construct an argument that plants and animals have internal and external structures that function to support survival, growth, behavior, and reproduction.

DCI: LS1.A. Structure and Function — Plants and animals have both internal and external structures that serve various functions in growth, survival, behavior, and reproduction.

Smart Science example: Stem Structure.  This lesson examines the structures of many plant stems in detail to correlate functional aspects of the stem with the appearance of the structures.

Finally, here is one from fourth-grade earth and space science.

Performance Expectation: 4-ESS2-1. Make observations and/or measurements to provide evidence of the effects of weathering or the rate of erosion by water, ice, wind, or vegetation.

DCI: ESS2.A. Earth Materials and Systems — Rainfall helps to shape the land and affects the types of living things found in a region.  Water, ice, wind, living organisms, and gravity break rocks, soils, and sediments into smaller particles and move them around.

Smart Science example: Erosion and Slope.  This lesson examines the erosion channels formed at differing slopes in a stream table.

Foundation #3: Crosscutting Concepts

1. Patterns

Patterns exist everywhere in the lessons, from the daily tides to seed germination.  Each lesson involves graphical display to help elucidate patterns.

2. Cause and effect: mechanism and explanation

You can hardly analyze science experiments without seeing cause and effect.  The daily tides lesson encourages students to figure out what causes tides.  Elastic and inelastic collisions lessons require analysis to figure out what quantities are conserved.  Determining molar ratios from precipitates has a similar outcome.

3. Scale, proportion, and quantity

Scale is a great topic for Smart Science® lessons.  Certainly, they cover the effect of elapsed time on falling objects, and you’ll find many other examples as well.

4. Systems and system models

Each experimental lesson addresses a system, whether it’s biological (e.g. seed germination and pollution), chemical (e.g. electrochemical series), physical (e.g. collisions), or earth-based (e.g. tides).

5. Energy and matter: flows, cycles, and conservation

A number of lessons address conservation of various quantities.  More are being prepared.

6. Structure and function

You’ll find this concept addressed in some lessons, such as Compound Pendulum.  This area will be expanded as the system grows beyond the current 250 lessons.

7. Stability and change

Feedback plays a critical role in stability.  Students should understand the difference between positive and negative feedback.  Some Smart Science lessons illustrate this effect and more are being prepared.

The Nature of Science

The nature of science pervades the entire Smart Science® system.  It’s hard to escape when you’re doing real experiments, when you make predictions before beginning experimentation, and when you have to explain your results at the end.  The quizzes before and after the experimentation ensure that students focus on what’s important and figure out principles rather than memorizing derived “facts” and formulas.

Engineering Design

Smart Science® lessons are being built to incorporate more engineering activities.  The system already has the technical capabilities to carry these out.  Engineering, in this context, involves two different sorts of activities.  Students must investigate specific properties of objects used in the engineering tasks.  Students also must design and create solutions to problems using these objects while understanding their properties.

The first of these activities fits nicely into the same template as the science lessons.  The materials and measurements are real.  The second fits into our “wet” lab (aka DIY) template.  Many wet labs have already been incorporated into Smart Science® lessons.  Many more will be built to focus more closely on engineering.

The NGSS describes engineering design in terms of three component ideas.

A. Defining and delimiting engineering problems

B. Designing solutions to engineering problems

C. Optimizing the design solution

Science, Technology, Society, and the Environment


The Smart Science® system has a series of environmental lessons that address some of the issues surrounding this topic.  More are being added.

© 2016 by Smart Science Education Inc., U.S.A. www.smartscience.net

Sunday, December 08, 2013

Smart Science® Labs Go Mobile

In a major new release of the Smart Science® online hands-on labs, the labs have gone mobile with HTML 5.  Built with the Google Web Toolkit, the new software is more accessible to users with disabilities and runs on a long list of devices.

The world's only online hands-on labs and best way to learn science are now available on a long list of web devices including:
  • Android tablets
  • Android phones
  • iPad
  • iPhone
  • Chromebook
  • Laptops and desktops
    • Linux
    • Windows
    • Mac OS X
    • various Unix systems
In fact, any system that supports the CANVAS and VIDEO tags of HTML 5 will run Smart Science labs now. Just be sure that JavaScript is enabled.

For a quick preview and check of compatibility, see our home page and click on the "TRY OUR NEW HTML 5 LITE DEMO NOW"  link in the upper right corner.

Smart Science online hands-on labs have been bringing real science to the online world for over a decade.  With more than 150 labs and content at multiple reading and math levels, these labs will meet your science learning goals, including those of the NGSS and America's Lab Report.

Finally, you can have the world's best science learning at your students' fingertips anywhere they have Internet access and on their own devices.

© 2013 by Smart Science Education Inc., U.S.A. www.smartscience.net
Follow this author on ETC Journal

Thursday, February 07, 2013

Remote Robotic Labs and Smart Science® Explorations

Recently, someone asked about the differences between remote robotic labs and the Smart Science® exploration online hands-on labs. This question may arise in the minds of many.

The various remote robotic labs (RRLs), including MIT's iLab, differ from our labs in some important respects. Because our approach is so different from those educators may already know, they often have trouble seeing where the distinctions lie. This explanation should help to clear up any questions.

A RRL provides data automatically. You set up your parameters, whatever those may be, and push a virtual button. Often, you see nothing transpire at the remote site. After a brief pause, you're handed a sheaf of data electronically. For advanced students, that may be just fine, but for ordinary students, all of the trouble of setting up the RRL has been wasted. You might as well have stored the data from yesterday (or last year) along with any imagery and provided that. In that event, you could have just provided this information locally. The students wouldn't know the difference and probably wouldn't even care.

RRLs have limited range. They cannot do Sordaria crossing over or seed germination experiments. You can imagine doing tides, but the real-time aspect is lost because students cannot be present for the entire period over which the data are captured. And so it goes. You cannot base an entire biology or chemistry course on RRLs alone.

RRLs have limited access. If you attempt to scale RRLs, you must have more pieces of expensive or unique equipment. Depending on the precise experiment being run, the time that the machine is available controls how many students can use it during a given hour-long period. It's not unlimited. You know that you cannot deliver to a million students per hour and probably not even to a thousand.

Our approach takes the online hands-on lab (OHOL) path. We toss out the pretense of real-time experiments. (I say pretense because there's always a delay between data capture and arrival at the student workstation.) In its place, we open up entire new vistas of learning science.

The OHOL way does not deliver data automatically. Students truly must interact to take their own data. As in the tides example, those data are different for different students doing the same experiment with the same parameters.

With OHOL, you have a visual experience. With tides, you watch the actual tides and then measure them yourself.

An OHOL can be created for any experiment you can record on video and take data from. The data may be quantitative, semi-quantitative, or qualitative. They are your data, not those of a machine. The experiment videos may be from a high-speed camera or from a time-lapse camera. They may even combine multiple cameras as with the shadows lab where one camera follows the Sun with a fish-eye lens and the other tracks the path of a shadow.

What do OHOLs and RRLs have in common? None of the data are invented. They all come from the real world. The various forms of real wet labs also have this feature. However, only manual wet labs and OHOLs are truly hands-on in the sense that you take your own data point by point.

Our technology allows for an unlimited number of scenarios. We're only limited by our imagination and our resources. We have done as many as 100 experiments to create one lab. The number of experiments available is also a function of the pedagogy. Students can be confused by having 30 experiments available. Some will think that they must do every one rather than exercise judgment (actually think) despite our telling them otherwise. It becomes the instructor's task to handle this issue because instructors control grades, and students who do every single experiment available are doing so because they think they'll improve their grades. The instructor must convince them that lack of thought will reduce their grades. Our best efforts cannot do so because we do not hand out grades.

There's much more to this picture. For example, we insist on students making predictions before beginning experiments. We provide introductory (pre-lab or formative) assessments and summary (post-lab or summative) assessments. We provide extensive background resources and an online lab report that can be customized for your classes.

The above is not to say that RRLs have no value. On the contrary, they are the go-to labs of the future for college engineering courses. They open up to undergraduate engineering students the use of expensive equipment that many schools cannot afford. They have limited use for college science courses; the limitations are those of a medium that requires complete automation and relatively quick experiment completion. They're of little value in K-12 education. You can find better ways to learn any science concept at that level, with the possible exception of advanced or honors courses and then, as with college science, only with a very few investigations.

© 2013 by Smart Science Education Inc., U.S.A. www.smartscience.net
Follow this author on ETC Journal

Wednesday, January 23, 2013

NGSS Have Problems

You can read a review of NGSS at http://etcjournal.com/2013/01/22/next-generation-science-standards-fall-flat/.  However, that's not the entire story.  Here's the rest of the story.

In the NGSS, "crosscutting concepts" are concepts that span all disciplines of science and engineering and, according to the authors, help to tie the standards together.  As a chemist, I look at those associated with chemistry standards.  I also look most closely at the high school standards to see what the standards do at their highest level.

The crosscutting concepts in high school chemistry (Structure and Properties of Matter and Chemical Reactions) are listed as follows:

  • Cause and Effect
  • Systems and System Models
  • Energy and Matter
  • Structure and Function
  • Stability and Change
  • Patterns
The one crosscutting concept I see missing here is Obtaining First-Hand Data from the Real World.

Science is about exploring the real world; it is an open exercise that examines what really happens, not what should happen in an ideal system.  While ideal systems are used as models against which to compare real data, scientists don't really care about models except as tools.

Here's one sample standard that exemplifies the approach of the NGSS.

Analyze and interpret provided data about bulk properties of various substances to support claims about the relative strength of the interactions among particles in the substance.
The standard does not specify whether the provided data are to be real or manufactured.  In this instance, you might infer that the data are real.

In the section on Forces and Interactions, you'll find the following standard that is much less clear.

Analyze data to support the claim that Newton’s second law of motion describes the mathematical relationship among the net force on macroscopic objects, their mass, and acceleration.
Where do these data come from?  From student experiments, from teacher experiments or demonstrations, or from a formula?  Very often, the data will come from a simulation, e.g. a formula.  You can expect teachers, when allowed by their state and local standards, to resort to this easier and more "reliable" approach whenever possible.

What does it mean to analyze manufactured data?  Here, you're using F=ma to generate data, and those data are then used to infer that the model they represent is F=ma.  This sort of thing is ludicrous.  I'd use stronger language but refrain out of respect for the reader.
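
To see how empty that exercise is, consider this deliberately circular sketch (the numbers are invented for illustration): the "data" are generated from F = ma and then fitted to recover F = ma.

```python
import numpy as np

# "Data" manufactured from the very formula under test: a = F / m
# for an object of assumed mass 2.0 kg.  Nothing here is measured.
mass = 2.0                                    # kg (assumed, not measured)
force = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # N
acceleration = force / mass                    # generated by F = m * a

# "Analysis": fit a = (1/m) * F and recover the mass from the slope.
slope = np.polyfit(force, acceleration, 1)[0]
print(f"inferred mass = {1 / slope:.1f} kg")   # exactly 2.0, by construction

# The agreement is perfect because the data were built from the model
# being tested; the exercise checks arithmetic, not Newton's second law.
```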

This is a closed cycle.  A formula generates data that are used to verify the same formula.  In science, however, it's always an open system.  Data come from the real world, or as America's Lab Report (ALR) says, "the material world."  Indeed, ALR insists that data originate in the material world in order for an activity truly to be a science investigation.  Ultimately, these data are analyzed and may result in a model of the real world.

The difference is night and day.  Where ALR focuses on students actually obtaining their own data for the most part, NGSS has students working with provided data.  Is there no hope?

Later on, the following standard provides some relief.

Design and conduct an investigation to support claims about how electric and magnetic fields are created.
Here, students must do experiments and collect their own data.  However, there's one minor problem as the Clarification Statement shows.
Qualitative observations only.
So, here is the one actual piece of lab work in HS.Forces and Interactions, and it's entirely qualitative.   You cannot do much with purely qualitative data.

Finally, under Energy, you can find a real lab.

Design and conduct an investigation to support the claim that the transfer of thermal energy between components results in a more uniform energy distribution among the components of a closed system.
In this standard, students are requested to "[use] mathematical thinking to describe the energy changes both quantitatively and conceptually."

That's it for the physical science portion of the standards: one quantitative investigation and one qualitative one for an entire year of physical science or for two years of chemistry and physics.

To be fair, these are "core concepts," and states, districts, and teachers are free to add to them and extend them.  However, if the states and districts do not mandate laboratory investigations, then teachers will tend to avoid the extra time and budgetary stress of true lab investigations.

I find these standards to be rather shallow for leaving out important concepts (e.g. the mole) and for failing to insist on more first-hand quantitative investigations.

The authors have become so enamored of their crosscutting concepts and of integrating engineering into science that they've lost the very essence of science.

© 2013 by Smart Science Education Inc., U.S.A. www.smartscience.net
Follow this author on ETC Journal