Group Update as of October 1, 2014

Well, it has been quite a while since my last post. After a cool summer and a hectic September, it is the right time to summarize briefly what the RILCIE research group has been doing.

First things first: Dr. Arnold Yuan was appointed in May as the founding Director of the Ryerson Institute for Infrastructure Innovation, or RIII (r-triple-eye). The position officially started on July 1st for a three-year term. He immediately found himself much busier than before: consulting with deans and colleagues, attending meetings and training workshops, making and receiving calls, and more recently giving a couple of interviews. Although the job consumes a lot of his time, it has turned out to be very exciting and challenging too.

RIII is a Faculty of Engineering and Architectural Science (FEAS)-based research institute striving to provide holistic solutions to urban infrastructure issues. RIII promotes interdisciplinary, applied research on infrastructure-related issues, both hard (engineering and technology) and soft (management and policy), and seeks collaboration with industry, public agencies and municipalities. The first research project has already been awarded, sponsored by PPP Canada, a Crown corporation overseeing federal funds for public-private partnership infrastructure projects. We are developing a website for RIII and hope to release it to the public very soon.

In June Arnold did something he had never thought he would do: he served as a Bedel for the June Convocation! If you do not know what a Bedel is, you are definitely not alone. Arnold agreed to do this service precisely because he did not know what the full responsibility would be when he was asked. Life is full of uncertainty, as Arnold always says. Arnold is not only a risk guy, but also a risky guy!

If you want to check his performance at the convocation, click here and drag the progress bar to 47:55. Ideally the procession should run exactly one minute; Arnold practiced at home for a couple of hours to figure this out. On stage he ran a little fast, but it looks okay.

In September Arnold also attended two external events, both related to infrastructure. On September 4 he attended the OCSI Infrastructure Forum; OCSI stands for the Ontario Coalition for Sustainable Infrastructure. The forum set up roundtable discussions on two fundamental questions: What is sustainable funding for infrastructure? And what is an affordable level of service? In the end the group identified a number of burning issues in addressing public infrastructure problems. At the top of the list were:

  • The public’s appreciation of infrastructure crisis
  • Comprehensive planning tools
  • Clear and transparent decision making criteria
  • Governments’ zero tolerance for innovation

From September 16 to 17 Arnold attended the 8th Infrastructure Summit on the Future of Canada’s Infrastructure. As one of the two chairs of the Summit, Arnold gave the opening speech early on the morning of the second day. He first made a little pitch on the need for research and innovation and then introduced Ryerson’s efforts under RIII.

Back at school, the RILCIE group welcomed two new members: Jing (Aviva) Zhang (PhD program) and Md. Abdus Salam (MEng program). A group meeting was held on September 22, at which each of the graduate students presented his or her research project.


How should you prepare for exams?

About six years ago, when I first came to Ryerson, I observed one thing: too many Ryerson students were interested only in marks, not in knowledge and problem-solving skills.  Unfortunately, I have not seen this change much, and it makes me very uneasy whenever I stop to think of your future careers.

I still wish to give you one last pitch on how to learn during the final exam period.  Raised in China, where students learn to manage five or six exams over two or three consecutive days, I have a very simple trick for exam preparation.  I already shared this trick with a student who came to my office today.  That is, ……………… LYING ON YOUR BED!!!

I know: many of you are busy preparing your cheat sheets.  To those, I would say, poor you!

Lying on your bed, ask yourself what you have learned and what you think you should have learned or are expected to know.  For the last question, read the course outline again.  I bet many of you never realized the importance of the course outline.  Sigh…!

If you understand the calendar description of the course, you already know the forest.  Now you need to walk through the 12 lectures in your mind, one by one.  What are the key concepts, methods and techniques?  Those are the big trunks you should know.  Try your best to forget the details (the bushes and grasses); the details lead you nowhere.

After this mental exercise, you will know what you already know and what you need to read or practice a bit more.  Then get up and study, or ask.  If you can do this lie-down-get-up exercise three times, you will excel, not only in the exam but also in any future job.

Of course, this method applies to those who have regularly attended my lectures.  For the infrequent attenders, I am sorry I cannot really help, although I wish I could.

Stay put and cool!


(Risk, Reliability) vs. (Resilience, Sustainability)

A recent in-press paper in the ASCE Journal of Infrastructure Systems is entitled “Resilience and sustainability of the civil infrastructure: towards a unified approach,” by two Lehigh professors (P. Bocchini and D. M. Frangopol) and two German researchers (T. Ummenhofer and his student T. Zinke).  The paper, in my opinion, makes a major contribution to the research community of RILCIE.  I strongly recommend that you read it.

It is interesting that the duality between resilience and sustainability is very similar to that between risk and reliability.  I did not see the authors point this out.


Group Updates as of June 25, 2013

(1) Dr. Arnold Yuan was tenured and promoted to Associate Professor in May 2013.

(2) Mr. Afzal Waseem successfully defended his MASc thesis, titled Methodology Development and Local Calibration of MEPDG Permanent Deformation Models for Ontario’s Flexible Pavements, in early May 2013.  Mr. Waseem was nominated by the examination committee as the departmental candidate for the Governor General’s Gold Medal on account of his high CGPA and publication record.

(3) Several new graduate students will join the RILCIE research team at Ryerson.  As of this update, one PhD student from Nigeria, one MASc student from Bangladesh, two full-time MEng students and one part-time MEng student have accepted their offers.  Another MASc student has yet to accept his offer, and one MEng student will transfer to the MASc program in Fall 2013 as well.

(4) Dr. Arnold Yuan will be traveling overseas from June 25 to August 6.  During this period, he will conduct collaborative research at Hunan University in structural reliability and infrastructure management.  In mid-July he will attend the 8th International Conference on Road and Airfield Pavement Technology (ICPT) in Taipei, Taiwan, where he will present a research paper on MEPDG local calibration co-authored with Afzal Waseem.  At the end of July, Dr. Yuan will also attend the International Conference on Nuclear Engineering in Chengdu, China.


Infrastructure Weekly Digest, 20130422

This weekly digest focuses on budget allocation for natural disaster risk management.

Budget allocation for natural disaster risk management mainly deals with allocation of disaster mitigation funds (which are usually public funds) to different disaster-prone regions for different mitigation strategies.  In general, the issue can be divided into the following levels:

  • Level 1: Funding level for natural disaster mitigation programs.  To answer this question, we need to understand how much is spent each year on public programs and capital investment, and what percentage should be devoted to natural disaster mitigation.  The output of this analysis becomes the input for the next levels of analysis.  This is largely a matter of political/policy analysis.
  • Level 2: Funding formula for different geographical regions.  To answer this question, we need an inventory of the public infrastructure that is vulnerable to natural disasters.  We also need to perform probabilistic hazard analysis to understand the overall risk level in different regions.  At a more detailed level, this stage may also deal with prioritization of systems, structures and components (SSCs) for rehabilitation.
  • Level 3: Budget allocation among different mitigation strategies.  The strategies considered include preventive retrofitting, post-event rehabilitation and reconstruction, and disaster emergency management.  To answer this question, we need to understand the cost effectiveness of pre-event retrofitting, disaster emergency management plans, and post-event loss estimation.
  • Level 4: Selection of optimal rehabilitation strategies.  This is a typical engineering design decision making issue.

The March issue of the ASCE Journal of Infrastructure Systems published a paper titled ‘Optimization-based regional hurricane mitigation planning’ by Legg et al.  This is a continuation of the work on resource allocation for natural disaster risk management by the research group led by Davidson and Nozick.  The proposed linear programming model focuses on the Level 2 and Level 3 questions and considers mainly the following three types of disaster mitigation alternatives:

  • pre-event retrofitting,
  • post-event retrofitting or reconstruction, and
  • post-event acquisition and demolition

The objective of the LP model is to minimize the total expected expenditure subject to pre- and post-event budget constraints.  The model divides buildings into many categories according to their vulnerability to hurricanes, the study region into many census tracts according to their hurricane susceptibility, and building resistance into several levels.  To assess building damage due to hurricane events, the authors adopt a detailed, component-based damage assessment approach.

Given the complexity of the proposed model, a heuristic solution procedure is employed.  For the case study that they carried out for eastern North Carolina, it was reported that the model contains more than 25 billion variables.
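To make the flavor of such a formulation concrete, here is a tiny, hypothetical pre-event allocation LP in Python; every number is invented for illustration, and the real model has billions of variables plus census tracts, resistance levels and post-event decisions that this sketch omits:

```python
from scipy.optimize import linprog

# Toy version of the idea: choose how many buildings of each category to
# retrofit before the event, subject to a pre-event budget, to minimize
# retrofit cost plus expected post-event losses.  All figures are made up.
n_buildings   = [1000, 500]    # buildings in categories A and B
retrofit_cost = [5.0, 20.0]    # $k per building, spent pre-event
exp_loss_un   = [12.0, 60.0]   # expected $k loss per unretrofitted building
exp_loss_re   = [3.0, 10.0]    # expected $k loss per retrofitted building
budget        = 8000.0         # pre-event budget, $k

# Decision variable x[i] = number of category-i buildings retrofitted.
# Total cost = sum_i [retrofit_cost[i] + exp_loss_re[i]] * x[i]
#            + sum_i exp_loss_un[i] * (n_buildings[i] - x[i]).
# Dropping the constant term, the coefficients to minimize are:
c = [retrofit_cost[i] + exp_loss_re[i] - exp_loss_un[i] for i in range(2)]

res = linprog(
    c,
    A_ub=[retrofit_cost],                  # pre-event budget constraint
    b_ub=[budget],
    bounds=[(0, n) for n in n_buildings],  # cannot retrofit more than exist
)
print(res.x)  # optimal retrofit counts per category
```

With these numbers, category B saves more expected loss per budget dollar, so the LP spends the whole budget there; the real model layers many such trade-offs across tracts and hazard scenarios.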

Several criticisms are given below:

(1) Resource allocation for natural disaster risk management is by nature a stochastic programming problem.  The authors used the expected number of buildings in different damage states as the state variable of the model.  There is nothing wrong with this.  However, the formulation becomes problematic when it expresses the budget constraints (both pre- and post-event) in terms of expected costs (Eqns. (5) and (6) on page 3).

(2) Although the separation of pre- and post-event retrofitting seems intuitive, it may not be necessary, particularly when the frequency of hurricanes is relatively high, say once every couple of years.

(3) Time discretization makes the modeling very clumsy.  The paper does not show the time trend of the results; however, given the Poisson assumption that has to be made for the disaster events, it is reasonable to expect that an equilibrium state would be reached if the annual budgets were constant.

 


The Risk of Performing Risk Assessment

Risk analysis aims to provide clients, and sometimes the public, with an objective view of the risks involved in a human endeavor or induced by a natural disaster.  In my Risk & Reliability for Engineers class, I have already heard one student state: “Risk analysts never make mistakes, because they will never say something bad will NOT happen; they will just say the probability is small.”

Yes.  Or not!

Recently six seismologists and one public servant in Italy were charged with manslaughter because of their ‘inadequate’ risk assessment.  For more details, read here.


Research Assistant Positions

For Prospective Students:

I am currently looking to hire an international PhD student to do research in risk-informed life-cycle infrastructure management.  Students with a strong background in civil engineering, industrial engineering, nuclear engineering, applied mathematics, or engineering management are welcome to apply.  Ideally, the PhD candidate in this area will have a solid background in the following subjects:

  • Probability and Statistics
  • Engineering Risk and Reliability
  • Operations Research, or Engineering Optimization; and
  • One of the following subject matters: structural analysis, fatigue and fracture, pavement design, or nuclear power plant systems

A competitive research stipend will be provided to dedicated students upon admission.

 

For Current Ryerson Graduates:

I am currently looking for a research assistant from among the first-year M.Eng. students to do research on pavement design and management.  The candidate will have the potential to transfer to the MASc program, with a competitive research stipend, if he or she satisfies the minimum transfer criteria upon completing the course work in May 2013.

I am also looking for an M.Eng. student (either year) to do a research project in condition assessment and performance evaluation for sewer systems.

If you are interested in any of these positions, please contact me or drop by my office at MON305 to discuss the details.


Statistics 101: Reject or Accept the Hypotheses

Hypothesis testing uses the information in a random sample from a population of interest to make inferences about a certain characteristic of that population.  It involves formulating hypotheses (both null and alternative), taking a random sample, computing a test statistic from the sample data, and then using the test statistic to make a decision about the null hypothesis.  If the information (expressed in the test statistic) is consistent with the null hypothesis, we do not reject the hypothesis; however, if the information is inconsistent with the null hypothesis, we conclude that the hypothesis is false.

For many beginners in statistics, the above highlighted statement is confusing, and to some, annoying, because it talks only about rejection and never says when we should accept a hypothesis.  Why are professors and researchers so stingy with wholehearted acceptance?

Well, this is not because of the peculiarities of those professionals, but because of the nature of scientific research.  To explain, let’s look at a simple example.  Suppose we have collected 9 random samples of the compressive strength of a given concrete mix.  From the data we calculate the sample mean and sample standard deviation to be, say, 49.2 MPa and 6.9 MPa, respectively.  Suppose also that the design spec states that the compressive strength should not be lower than, say, 40 MPa.  (Note that the language used in the design spec is a little sloppy here.  What does the compressive strength really refer to?  A mean value, a 5th percentile, or something else?  Let us assume for now that it refers to the mean value of the compressive strength being no less than 40 MPa.  If it referred to some percentile, we would actually need the concept of a tolerance interval for this discussion.)  Then, we can formulate the hypotheses as

  • NH: \mu = 40 MPa
  • AH: \mu < 40 MPa

where the Greek \mu refers to the population mean of the compressive strength.

To proceed with the hypothesis test, we reason in this way: what we have is only some sample data and their statistics.  If we can establish how probable this sample would be if the population mean were indeed 40 MPa (i.e., if the NH were true), then we can make a statement about the truth of the null hypothesis.

For this example, it can be shown that, given a population mean of 40 MPa, the probability of observing a sample more extreme than the observed one is only 0.2%, suggesting a very low probability that the population mean is 40 MPa.  Hence we reject the NH.  (The details are not the focus of this blog, but the t-statistic can be used for this simple case.  Using the results, we get t = (49.2-40)/(6.9/sqrt(9)) = 4.)
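The arithmetic can be reproduced with a few lines of Python (using scipy; the inputs are simply the sample statistics from the example above):

```python
from math import sqrt
from scipy import stats

xbar, s, n, mu0 = 49.2, 6.9, 9, 40.0   # sample mean, sample sd, sample size, hypothesized mean
t = (xbar - mu0) / (s / sqrt(n))       # t-statistic with n - 1 = 8 degrees of freedom
p_upper = stats.t.sf(t, df=n - 1)      # P(T >= t): chance of a sample mean this far above 40
print(round(t, 6), round(p_upper, 4))  # 4.0 0.002
```

The upper-tail probability of about 0.002 is the 0.2% quoted in the text.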

Do we accept the AH then?  Not really.  For a population with a mean smaller than 40 MPa, we would expect the sample mean to be even smaller.  Therefore, in this specific case, we reject both the NH and the AH.

“But what’s the point of the hypothesis testing?” you would ask immediately.

The point is this: the sample testifies that the population mean is higher than 40 MPa.  It is very unlikely to be 40 MPa or below.

Of course, if the AH is a two-sided version, i.e., \mu != 40 MPa, then the rejection of the NH will indicate an acceptance of the AH.

The other scenario is that of not rejecting the NH. In this case, can we accept the NH instead?

Suppose that a second sample yields a sample mean of exactly 40 MPa.  Then, whatever the standard deviation is, the t-statistic is 0 and the p-value is 0.5, or 50%.  Therefore, there is no evidence that the NH (\mu = 40 MPa) should be rejected.

But can we accept that the population mean is 40 MPa indeed?

Not really, again!  Why?  Because if the population mean were 38 MPa instead, there would be a significant possibility, as long as the population standard deviation is not too small, that the sample mean could be as high as 40 MPa, the observed value.  The same holds true when the population mean is 42 MPa.
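A quick simulation makes this concrete (a sketch only; the 38 MPa mean, the 6.9 MPa standard deviation and n = 9 are borrowed from the running example):

```python
import random

random.seed(1)
mu_true, sigma, n, trials = 38.0, 6.9, 9, 100_000

# Count how often a sample of 9 drawn from a mean-38 population
# yields a sample mean at least as high as the observed 40 MPa.
hits = sum(
    sum(random.gauss(mu_true, sigma) for _ in range(n)) / n >= 40.0
    for _ in range(trials)
)
print(hits / trials)  # about 0.19, i.e. far from negligible
```

Roughly one sample in five from a 38 MPa population looks at least as good as the one we observed, which is why non-rejection of the NH is not the same as accepting it.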

Logically, this makes sense.  But practically, what can we do when a no-rejection conclusion is reached?  Usually, we act as if the NH represents the status quo, i.e., we take no action.


Book Review: The Failure of Risk Management


Douglas W. Hubbard, The failure of risk management: why it’s broken and how to fix it. Hoboken, N.J. : Wiley, c2009.  Ryerson Library Call Number:  HD61 .H76 2009, 7th Floor.

In June 2012 I attended, for the first time, the annual conference of the Canadian Society for Civil Engineering.  A challenge at any conference is that there are always so many interesting presentations that you would not want to just sit through all the presentations in one room; rather, you may opt to shuttle around.  This conference was no exception for me: I jumped across themes, from structures and earthquakes through transportation to construction and project management.  One title that really attracted me was something like “integration of value engineering and risk analysis.”  The presenter was from industry, with a manager title as well as a PhD from a renowned university in western Canada.  Later I found out that his PhD supervisor was a big name in the field of project risk management.

I had been very enthusiastic about this presentation because I wondered what one could really add to this subject.  After all, according to Steve Holmes, a senior engineer at the Ontario Ministry of Transportation and an influential proponent of value engineering in Canada, from whom I received my basic training in value analysis, risk analysis must be an essential element of any value analysis (VA).

Steve is a truly dedicated person who will grasp whatever opportunity and means he can to convince people to study and employ VA.  After several telephone and email exchanges, I met Steve in person for the first time at the CSVA conference last October.  After the conference, I wrote to Steve that VA was indeed a very structured approach to innovation.  However, I also politely proposed that much research would be needed before VA could gain wider acceptance in planning, engineering design and project evaluation.  One area needing immediate improvement is the risk assessment method that structured VA currently adopts.  I said this because I felt that VA was just a repackaged and much simplified version of tools that have long been practiced in systems engineering; for example, there are many proven tools for functional analysis, performance measurement and risk analysis in that field.  Surprisingly, Steve gave me an emotional reply, seemingly suggesting that there was nothing wrong with VA and that it was the university professors who should be blamed for its lack of acceptance, because we do not teach students VA in our curricula.  I admitted to him that I did not know the details of value analysis in transportation engineering.  Steve kindly arranged a VA101 course for me, and I attended it in November.  After the one-day training workshop at Edwards Gardens in Toronto, my view of value analysis did not change.  Rather, I was more convinced that the scoring approach to risk analysis, and the somewhat arbitrary definition of ‘value’, were probably the main reasons that have prevented VA from gaining wider acceptance in industry.

So I sat through the CSCE conference presentation, hoping to see some sophisticated risk analysis method proposed for value analysis.  To my disappointment, after a grandiose opening the presenter swiftly fell back on the routine scoring technique for evaluating the “likelihood” and “consequence”.  No surprise, he also multiplied the two scores to obtain a risk number.

Right after he turned to the ‘thank you’ slide, I raised my hand.  Without any rhetoric, I asked, “How can you multiply two ordinal numbers to obtain a cardinal one for comparison?”  After a puzzled pause, the presenter replied, “What do you mean?”  I had probably lost my patience by then, as I do not recall explaining my objection clearly.  However, I do remember telling him and the audience that 2 times 2 is not necessarily 4 in such a calculation.  A little frustrated by my comments, the presenter must have been thinking, I could tell, where is this guy coming from?  To give his defense some strength, he told me that Professor ABC (the big name I mentioned earlier) also used the approach.  Alas, the so-called ‘best practice’!
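My objection can be illustrated with a toy example (the scales and numbers below are my own invention, not from the talk): each 1-5 score merely bins an underlying quantity, so two risks can receive identical likelihood-times-consequence scores while their implied expected losses differ by an order of magnitude.

```python
# Hypothetical quantities hiding behind the 1-5 ordinal scores.
likelihood_per_year = {1: 0.001, 2: 0.01, 3: 0.1, 4: 0.5, 5: 1.0}
loss_dollars        = {1: 1e3, 2: 1e4, 3: 1e5, 4: 1e6, 5: 1e7}

risk_a = (2, 5)   # (likelihood score, consequence score): rare but catastrophic
risk_b = (5, 2)   # frequent but cheap

def score(r):
    """The scoring method under critique: multiply the two ordinal scores."""
    return r[0] * r[1]

def expected_loss(r):
    """Expected annual loss implied by the underlying quantities."""
    return likelihood_per_year[r[0]] * loss_dollars[r[1]]

print(score(risk_a), score(risk_b))                  # 10 10 -> 'equally risky'
print(expected_loss(risk_a), expected_loss(risk_b))  # 100000.0 10000.0 -> a 10x difference
```

The scoring method ranks the two risks as equal; the underlying arithmetic says one is ten times worse.  That is what "2 times 2 is not necessarily 4" was meant to convey.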

So when I got hold of Hubbard’s Failure of Risk Management, I could not help finishing it in one straight reading.  The author reveals all of the tricks that ‘risk management professionals’ use to sell the ‘snake oil’ (pp. 71-74).  For example, when reading “Sell ‘Structured’ Approaches,” I thought: yes, the value analysis workshop was structured to take five days, as if the whole facilitation procedure had been inscribed in stone.  In “Sell What Feels Right,” Mr. Hubbard tells the cold joke that ‘if you call it a score, it will sound more like golf, and it will be more fun for them.’  Undoubtedly, Mr. Hubbard would smile if he read Steve’s response and the presenter’s reaction to my questions.  Throughout the whole day of reading this book, I kept nodding and saying to myself, ‘That is exactly what I wanted to say but was not bold enough to speak out in many situations.’

The book starts by asking three basic questions: 1) do any of the popular risk management methods work?  2) would anyone in the organization even know if they didn’t work?  3) if they didn’t work, what would be the consequences?  The author argues that the methods are fundamentally flawed and that risk managers suffer from, or enjoy, a placebo effect.  The consequence?  A common-mode failure; in fact, “a weak risk management approach is effectively the biggest risk in the organization” (p. 6).  The reasons for this failure include

  • the failure to measure and validate methods as a whole or in part,
  • the use of components that are known not to work, and
  • the lack of use of components that are known to work.

Clearly, among the four key elements of risk management (identification, modeling and analysis, mitigation, and review), modeling and analysis is the focus of the author’s examination.

The author compared the traits of four major professions that use risk management: actuaries, war quants (or operations researchers), economists (financial engineers), and management consultants.  This comparison is much clearer and more conclusive than a recent publication by Samson et al. (2009), who, after reviewing the many different views, added one more confusion to the community.

Mr Hubbard identified seven challenges for risk management:

  1. confusion regarding the concept of risk
  2. completely avoidable human errors in subjective judgments of risk
  3. entirely ineffectual but popular subjective scoring methods
  4. misconceptions that block the use of better, existing methods
  5. recurring errors in even the most sophisticated models
  6. institutional factors, and
  7. unproductive incentive structures

With my engineering background, I was particularly satisfied with the author’s analysis of the flaws in the popular scoring methods.  I also much admired his way of dismantling all sorts of accusations leveled against sophisticated, quantitative methods of risk modeling.  The summary of Kahneman and Tversky’s work on judgment and biases is very concise.  I also agree with the author’s stand on subjectivism in the interpretation of probability.  Of course, the common errors in quantitative models discussed in Chapter 9 were not new to me.  One area that I felt could be strengthened is probability calibration for the inputs to Monte Carlo simulation.

The thesis of the book is to advocate scientific, quantitative risk modeling, rather than the qualitative (scoring) approaches so often touted as ‘best practices’, in risk management.  I encourage every student in my Risk & Reliability for Engineers class to read this book.


Research Activities in Summer 2012

Alas, time flies, and now it is almost the end of the summer.  I heard many people complaining about the long-lasting heat; for me, the summer was definitely too short.

The main theme of the summer research was highway pavement design.  More specifically, it was the local calibration of the mechanistic-empirical pavement design guide (MEPDG) developed by AASHTO under several NCHRP projects.  Two of my graduate students and one high school student worked intensely on this project.  Gulfam Jannat completed her MASc study in April and continued her research as a Research Associate for two months.  Afzal Waseem transferred from MEng to MASc in May (officially in September).  The high school kid is a real computer genius; he helped us solve several technical difficulties that had lingered for a long time.  Based on this research, we submitted two papers to the 2013 Transportation Research Board (TRB) conference.

In June, I attended for the first time the annual conference of the Canadian Society for Civil Engineering (CSCE), held in Edmonton, Alberta and hosted by the University of Alberta.  I presented two papers, one on MEPDG local calibration and the other on pavement performance modeling.  It appears that we have done the first comprehensive study on local calibration of the MEPDG for Ontario.

Part of my research is on nuclear piping reliability and degradation modeling.  In July, I attended the annual Pressure Vessels and Piping (PVP) conference sponsored by the American Society of Mechanical Engineers (ASME); this year it was held in Toronto.  I co-chaired one session on component reliability analysis.  Our paper discussed a few hard, fundamental issues in degradation modeling, using feeder wall thinning due to flow-accelerated corrosion as an example.  Our presentation attracted a large audience, if not the largest, among the technical sessions.

For family reasons, I gave up the chance to travel to Japan in August to attend another conference, on structural reliability.

Shuo Yu, a graduate student at the University of Waterloo under my co-supervision, has made significant progress in common-cause failure modeling and failure rate estimation.  The empirical Bayes method and the James-Stein method were the focus of his study.

On August 1, Dr. Zhao Zhao, a visiting scholar from China, started his one-year visiting research.  Dr. Zhao’s main current research focus is technological risk assessment.
