Author Archive

NANO Institute 2011

Friday, July 8th, 2011

posted by: mcrocker

This year’s NANO Institute should be a great opportunity for more local teachers, students, and schools to increase their exposure to the world of the nanoscale through the work of NDeRC’s NANO Collaboration.  While the makeup of the NANO Collaboration is changing, we will be out in full force for the summer, and the institute will be better than ever.  As for possible school-year NANOweeks, the experiences of the past year, along with the inroads we made with other teachers and schools, should allow NANO to continue its success into the future.  NANO is the new BioEyes!

NurtureShock – 2nd Cut

Friday, May 13th, 2011

posted by: mcrocker

Looking back at my two years of teaching and outreach through the NDeRC GK-12 fellowship, I wanted to reflect on what I experienced.  I also promised a second look at the book NurtureShock.  In my first blog post about this book, I took a critical look at the first 4 chapters.  I definitely used a harsh tone in that post, and I would like to revisit the last 6 chapters from a different standpoint.  The book is filled with hints about how to teach children successfully.  I find that these suggestions are often quite obvious:  don’t lie to children, make sure children get the sleep they need, be clear when teaching children by using language that they understand, etc.  I was shocked because the conclusions of the first 4 chapters seemed like they could be summed up as follows: be honest, loving, and thoughtful.  Even when I was the child being taught, I knew that there was no magic to teaching well or raising children well.  Simply be patient, loving, and selfless.  Using business tactics and focusing ONLY on “results” will backfire when it comes to human learning.

This time, I finished reading the last 6 chapters, and I looked at the content from the perspective I gained during my two years in the fellowship.  During this time I traveled to many schools in the area, and I heard many stories from teachers and students.  While many of the conclusions in the book seemed obvious to me, the authors go to great lengths to explain these conclusions in a very scientific way.  It turns out that these conclusions are not obvious to a great many people in our society and thus not obvious to a great many people in our schools.  (For example, some schools test children in pre-school in an attempt to gauge their future academic success and lock them into the “advanced” classes.)  I have noticed that in stories about award-winning teachers, the teachers are described as showing concern and care for their students, rather than implementing a perfect lesson plan.  It turns out that the human brain is designed to feed off of honesty, conflict resolution, and positive reinforcement, not negative fear training or one-size-fits-all approaches.

One of the most telling conclusions from the book (after reading the whole thing) is that a child’s brain does not work like an adult’s brain.  That does not mean that children should be babied, but rather that children should be treated as if they are full of potential, not bracketed into categories of achievement.  At any point before adulthood, it is very common for children to make rapid improvements, no matter how much they were under-performing before that period of growth.

Another key point in the book is that children learn from everything, including negative experiences.  In fact, the biggest failing of children’s books and attempts to shield children from hardship is that they take away or reduce the many natural opportunities for children to see conflict resolution.  The key is to repeatedly show children the RESOLUTION of a conflict situation more often than the conflict itself.  Said another way, failure is not good or bad.  The only real failure is forcing children to think that perfection is something to strive for.  The pursuit of excellence has nothing to do with never making a mistake; rather, it is about weaving solutions and beneficial results into a world that GUARANTEES mistakes.  If a student appears perfect, it is time to find out what is really going on.  Parents need to get the message just as much as teachers.  Raising children is about helping them learn how to deal with hardship.  Consistent arguments between parents and children with constructive resolutions are the way things SHOULD be.  Punishment should not be seen as a reason to despair.

Even for infants and toddlers, a parent needs to be firm and consistent.  If parents want progress, they should reinforce good developments.  Not everything a baby does is helpful, so why reinforce actions that regress or stall development?  It is problematic to reward everything and everyone all the time.  Parents and teachers who reinforce both good and bad behavior are just producing children who cannot easily discriminate between progress and regression.  In addition, there is no such thing as “wasting” time when teaching.  For training and learning to work, there have to be intense periods of gaining skills and knowledge with significant breaks in between.  As the NDeRC fellows learned in our brain workshop, trying to learn two similar skills without sleeping in between actually confuses your mind and body and makes you worse at both.

I am so glad to have had the NDeRC experience for the last two years.  It has helped me teach students, teach my own children, and teach myself.  It has improved my confidence, and it has been a very positive experience for me.  I have often found that those who have been involved with the NDeRC project have increased their ability to learn on their own and have gained the ability to build that skill in others.  I hope that I can continue such a trend wherever I go in the future.

Computers and Business

Sunday, April 3rd, 2011

posted by: mcrocker

When I visited Penn High School on Thursday, one of the students approached me after class to talk about my research into nanomagnets for computing.

The thing he said that interested me the most, however, was that he was looking to study Computers and Business at IUSB.  A lot of people have told me that CS and business people should collaborate more often.  My father is a chemical engineer, and after entering the workforce he began to wish he had more business/management skills.  There are entrepreneurial efforts at a lot of schools, including Notre Dame.  The big startup success stories often involve an unlikely partnership between a technology guy and a business/marketing guy.

I wonder if there is a way to take advantage of such a combination when doing STEM outreach…

Community Visibility

Sunday, February 27th, 2011

posted by: mcrocker

The Notre Dame extended Research Community (NDeRC) is finishing up its fourth year.  In that time, the people in and around the program have accomplished so many great things: after-school activities, enhanced curricula, hands-on science activities, and guest visits from research graduate students.  All of these contributions have enhanced student learning and, hopefully, encouraged every student to see science and mathematics in a new light.  But one of the biggest accomplishments of NDeRC in these four years is spreading the NDeRC name and, for the long run, spreading the Michiana STEM name.

Michiana, the state of Indiana, and the U.S.A. as a whole are all facing serious challenges in K-12 education.  Many school districts, and thus many schools, are facing funding cuts.  There are more students to teach and fewer resources to do it with.  Until the funding situation changes (and who knows what will change that?  better economy?  change in priorities?  change in how schools are organized?), something else must change to turn the tide toward better learning environments and better “outcomes.”  NDeRC has made a big difference in the lives of thousands of students in the Michiana area, and it is in a good position to continue and expand that positive influence.

There is a large demand for well-educated STEM students and STEM professionals.  In some ways, cultural forces are creating this demand (or scarcity of STEM professionals to be employed), as youth are told that scientists and engineers are “nerdy” or socially inept, and thus unworthy of aspiring to, when in reality STEM professionals include doctors, business people, innovators who create new technology, advisers to government officials (or the officials themselves), environmentalists, historians, even Indiana Jones!  If students knew that by studying science and math they could do so much, many more would want to.  Not every scientist is mixing chemicals in beakers, and those who do have a lot of fun doing it.  NDeRC activities in the classroom act as a dynamic “awareness week” for students.

Students in traditional schools spend an overwhelming amount of time with peers who are almost exactly their own age, and with adult teachers.  Sometimes they see students just above and just below them in age, and students from big families get to see siblings of different ages too.  However, the half generation above (people who are about 10 years older) is almost invisible.  By letting K-12 students see what graduate students are doing, they can more easily imagine themselves doing something similar.  Graduate students can be role models for the next half generation in a unique way.

Along those same lines, NDeRC is leading the way in how a community sees education.  Colleges and universities are often very separate from the nearby communities, and Notre Dame is no exception.  There are ways that Notre Dame does outreach to the community, but in general it is not very visible.  In fact, besides its local work, there is much that Notre Dame does far away from the Michiana area.  NDeRC is trying to change that visibility as much as possible.  NDeRC is showing up more and more in teacher-teacher and teacher-administrator conversations, in school newsletters, and even in local newspapers.  The more the better.  There are lots of people and groups out there helping to enhance education, but they are often not aware of one another.  Instead of a continent, there are only islands.  With NDeRC’s vision of STEM community, NDeRC needs to be leading the way for visibility as well.

What media outlets could be targeted?  ND newspapers and magazines?  More inside connections to local papers?  Word of mouth is working.  We are growing our internet presence.  There is a lot to do.

Science Fair

Wednesday, February 2nd, 2011

posted by: mcrocker

I recently judged again at a local science fair, and I could not help but be curious about the scoring process I was involved in.  I know that I have been second-guessing myself a lot recently, but I was really worried that none of the projects I judged were among the winners.  There were about 30 projects and about 10 winners, so it could just be random variation.  However, I talked to some of the other judges there, and I felt there was a significant difference between my group’s average rating (~45) and the other group’s (~55).  The scores were out of 60, and other judges told me that their groups awarded a 60 multiple times.  The highest my group awarded was a 49.  It made me feel bad, as if I should have known to judge differently, or something.

After further reflection, however, I decided that there is nothing to suggest that I judged wrongly.  There might have been much better science fair projects than the ones I judged.  Also, I trust that if there was a really excellent one (or two or three), it (or they) would have won, regardless of any scoring inflation or inconsistency.  The organizers might also have had something in place to account for differences among the judges.  I hope this is just a symptom of my current woes, which stem more from the approaching end of my Ph.D., big-picture life issues, and the long waiting period of a very competitive job search.

Job Search Craziness

Monday, January 3rd, 2011

posted by: mcrocker

The plan is to graduate in the spring with my Ph.D.  I have been looking for a faculty job at various colleges and universities.  I found 50 openings that looked good, but as deadlines passed and I started to apply, the list shrank to 36.  I have applied to 24 and have 12 more to do if everything works out.  In the meantime, I have to wait.  I have no idea how many interviews I will make the cut for.  I also do not know what the timeline will be like, assuming I get any interviews and any job offers.  Part of me is worried that I will not get many interviews and zero offers.  Another irrational worry is that I will be forced to make a decision on a single offer while wondering if I am still in the running with other schools.  For job searches in general, it is preferable that offers arrive at around the same time so that one can make an informed decision.  I guess time will tell.

In any case, it is interesting to sum up the process with a picture.  I had a nice clean list that took me days to put together, revise, and organize.  Now it is a mess of short notes, cross-outs, circles, and check marks.  I am past the hopeful part, past the worried part, and now I just want to be done with applications so that I can wait without thinking that I could have or should have done something differently.

Algorithmic Thinking

Wednesday, November 17th, 2010

posted by: mcrocker

I have heard the term “algorithmic thinking” tossed around a lot since joining NDeRC.  My teaching at Trinity involves teaching computer programming to juniors in high school.  Most of these students have never done any programming before.  The final goal of teaching computer programming at Trinity is to give the students the skills needed to create tools in MATLAB that can “simulate” kinematics, kinetics, and other physics topics taught at Trinity, so the students can try more experiments than are possible in a high school laboratory.  (It also has a very beneficial side purpose: exposing students to computer programming early, which is a good idea in general.)
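To make that end goal concrete, here is a minimal sketch of the kind of simulation tool the students work toward.  It is written in Python purely for illustration (the class itself uses MATLAB), and the function name, step size, and initial conditions are my own invention:

```python
# A toy kinematics simulator: free fall via an explicit Euler step,
# checked against the closed-form result y = y0 + v0*t - (1/2)*g*t^2.

def simulate_fall(y0, v0, g=9.81, dt=0.001, steps=1000):
    """Step position and velocity forward in time; return the final height."""
    y, v = y0, v0
    for _ in range(steps):
        y += v * dt   # move using the current velocity
        v -= g * dt   # gravity reduces the velocity each step
    return y

# Drop from 10 m for 1 s (1000 steps of 0.001 s) and compare.
numeric = simulate_fall(y0=10.0, v0=0.0)
analytic = 10.0 - 0.5 * 9.81 * 1.0**2
print(abs(numeric - analytic) < 0.01)  # the two agree to within 1 cm
```

Shrinking `dt` (while increasing `steps`) tightens the agreement, which is itself a nice lesson about numerical error.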

In the course of teaching, I wanted to understand how well I was doing.  I assumed that this idea of algorithmic thinking was the best way to evaluate how well the students were learning.  However, it seems that algorithmic thinking is not the biggest issue when teaching the basics of computer programming.  Most students are really good at breaking down the steps of a task; it is similar to explaining how to do something to a child.  I found that students would be good at explaining tasks to a computer if the computer knew English.  So really, the students just need to understand the computer commands.  That is harder, since the things a computer can do easily are not the same as the things a person can do easily.  Computers perform calculations very quickly and without error, but they cannot make assumptions or correct for inexact commands.

So I came to the conclusion that abstraction is the key thing to teach students.  Once the student learns how to make a generalized statement in a form that the computer can understand, everything else falls into place.  However, students often have a hard time figuring out how to use an index variable in a loop.  Even v(i)=v(i-1)+1 is hard for them to come up with.  Having the students create and use their own functions, which have different variable scope, is even harder.
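As a concrete sketch of that hurdle (in Python for illustration, since the class uses MATLAB; the function name is made up), the v(i)=v(i-1)+1 pattern is a loop in which each element is built from the previous one, and wrapping the loop in a function is exactly the abstraction step:

```python
# The v(i) = v(i-1) + 1 pattern: each element comes from the one before it.
def count_up(n, start=0):
    """Return a list of n entries where each entry is the previous one plus one."""
    v = [start]
    for i in range(1, n):
        v.append(v[i - 1] + 1)  # same idea as v(i) = v(i-1) + 1
    return v

print(count_up(5))           # → [0, 1, 2, 3, 4]
print(count_up(3, start=7))  # → [7, 8, 9]
```

The loop body never mentions a specific list or a specific length; that generality is what makes it hard for beginners to invent, and also what makes it powerful.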

It is possible that algorithmic thinking is much more important later on, especially when it comes to more complicated problems (for example, problems that have more general inputs of arbitrary size).  However, when first teaching programming to students, the main hurdles seem to be with:  (1) understanding what computer commands are likely to be available to them, and (2) understanding how to abstract or generalize commands to apply to any dataset.

Also, the skills that students had before starting programming seem to go away due to frustration.  For simple programs, students tend to be willing to step through the programs in their minds.  With larger programs, they tend to rely on previous examples and miss obvious errors that they would easily catch if they went through the code line by line.  I do that myself, since code can get very long at times.  In the end, it seems there is a lot more for me to understand about how first-time programmers understand what they are doing.

NANO Classroom Visits

Saturday, November 6th, 2010

posted by: mcrocker

Over the past few weeks, the three graduate fellows in the NANO collaboration (myself included) have visited 3 different classrooms, providing exciting nano-related instruction in 9 different classes for hundreds of middle and high school students.  It has been an interesting experience, and we have continued to modify our activities and rearrange our presentations based on what has worked well in past classes.  There was a significant difference between high school and middle school attention spans.  However, each class had a unique personality, and the makeup of each classroom was a fun mishmash of student personalities.  In one classroom, it was the quiet, studious student who showed the most interest.  In another, it was the loud, raucous student who showed the most interest and gave a final positive exclamation at the end of class in support of our visit.  While some students complained that our activities and presentations were a waste of time, most of the students were eager to measure ping-pong balls … ahem … atoms.  While some students were just glad to have a change of pace, many showed enthusiasm for our visit, for discussions of college and graduate school, for live demonstrations of unusual microscopes, and for discussions of the implications and diversity of career paths connected to nanotechnology.  I am excited to do this again, and we will have ample opportunity over the next half year!

Survey for Students

Wednesday, September 29th, 2010

posted by: mcrocker

At the start of the semester, I gave a survey to the juniors at Trinity.  The survey included questions about algorithmic thinking, questions about previous computer programming experience, questions about friends or family members who are scientists, and the now-popular “attitude towards scientists” word-selection question: the one where students are asked to circle the 5 words that best describe how they view scientists.

I found that very few of the students circled negative words about scientists.  Perhaps I did not choose a very good word set, or perhaps the students at Trinity already have a more realistic view of scientists (not the standard media-influenced image).  It is also entirely possible that I have a mistaken view of how the general public views scientists.

The students did very well on the algorithmic thinking part of the survey.  The questions were designed to see if the students could evaluate actions that occur in a certain order and give the correct final result.  I wonder if the questions were too easy, or if algorithmic thinking is not that difficult, or if the students were already good at it for some reason.  Now that we are teaching the students to create functions in Alice (and are about to get into MATLAB), I wonder if questions about abstraction (which is what functions provide) would be worth having on the next survey.

Uncertainty in Computing

Wednesday, September 22nd, 2010

posted by: mcrocker

Computers and machines (that are well designed and debugged!) are generally much more reliable than humans for completing many simple but repetitive tasks.  The cotton gin changed cotton production dramatically.  Calculators replaced slide rules as they became more powerful and cheaper.  Computers can be programmed to automate any task that can be interfaced to electronic circuits.

As the smallest transistors continue to shrink, the reliability of computer circuits has come under fire from the possibility of “high” defect and soft fault rates.  In integrated circuit manufacturing (making computer chips), the strategy for handling fabrication defects is simply to throw away any chip that has one or more defects on it.  That works just fine if the number of places a defect can occur is small and the defect rate is small.  The Intel 4004 microprocessor had between 2,000 and 3,000 transistors.  The newest micro(nano?)processors have 1 billion (or more) transistors.  Defect rates have to be really, really small — much better than 1 in 1 billion — for yields to be high enough to be worth it.  This defect problem has been around for many years; programmable logic became a big deal in the early 1980s in part because of concerns about defects.  There is another issue, although it is mostly a concern for massively parallel supercomputers.  With thousands of computer processing units running for weeks at a time, each executing billions of instructions per second (100,000 processors * 1 GHz * 60 sec * 60 min * 24 hours * 7 days = 6*10^19 instructions per week), you cannot rely on the entire parallel job running to completion without a single soft fault, even at a low fault rate.  Some effort has to be made to keep “snapshots” of the program execution so it can be restarted if it fails in the middle of running.
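The arithmetic behind both problems is easy to check.  Here is a quick Python sketch of the yield estimate and the instruction count; the per-transistor defect probability is an illustrative assumption, not a measured value:

```python
# Yield with N possible defect sites, each independently defective
# with probability p: a chip is good only if every site is defect-free.
N = 1_000_000_000       # transistors on a modern processor
p = 1e-12               # assumed per-transistor defect rate (illustrative)
chip_yield = (1 - p) ** N
print(f"yield: {chip_yield:.4f}")  # ~0.9990, viable only because p << 1/N

# Instructions executed by the hypothetical parallel job in one week.
instructions = 100_000 * 1e9 * 60 * 60 * 24 * 7
print(f"instructions/week: {instructions:.2e}")  # 6.05e+19
```

At that scale, even one soft fault per 10^19 instructions would hit such a job several times per week, which is why checkpointing matters.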

As computers become more powerful and are used to do more amazing things, the small weaknesses get compounded and become glaring.  Each transistor requires less power, but there are more of them in a smaller space.  Now, personal computers overheat with ease (if the CPU fan fails), and the most powerful supercomputer in the world is also one of the most likely to fail (if it is being used at full capacity without preventative measures).
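The “snapshots” strategy can be sketched in a few lines.  This is a toy Python illustration of checkpoint/restart, not any real supercomputer’s scheme; the file name and checkpoint interval are invented:

```python
import json
import os

CHECKPOINT = "state.json"  # hypothetical snapshot file

def run(total_steps, checkpoint_every=100):
    """Do total_steps units of work, snapshotting progress periodically."""
    # Restart from the last snapshot if one exists; otherwise start fresh.
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT) as f:
            state = json.load(f)
    else:
        state = {"step": 0, "total": 0}

    while state["step"] < total_steps:
        state["total"] += state["step"]  # stand-in for real computation
        state["step"] += 1
        if state["step"] % checkpoint_every == 0:
            with open(CHECKPOINT, "w") as f:  # snapshot: lose at most
                json.dump(state, f)           # checkpoint_every steps
    return state["total"]

print(run(1000))  # sum of 0..999 = 499500
```

If the job dies mid-run, re-invoking `run(1000)` picks up from the last snapshot instead of starting over, so at most `checkpoint_every` steps of work are lost.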
