The BlueJ Experience:

Implementing Educational Innovation

Dianne Hagan
Michael Kolling
Selby Markham

Computing Education Research Group

Faculty of Information Technology
Monash University

 

Abstract

As part of the development of the teaching of first year programming, the BlueJ programming environment was designed to support the Java language. The implementation was supported by a comprehensive study of student responses using a longitudinal research design. Student response to this novel educational innovation was basically positive and supportive of the staff carrying out the innovation. Students rated the usefulness of BlueJ quite highly in spite of rating its implementation efficiency as quite low. There was no indication that students perceived BlueJ as a negative influence on their performance in the subject.

Introduction

Computer programming is taught in a wide variety of ways, partly dependent upon the support tools available for a language. Standard compilers have provided a fundamental tool, while interactive compilers such as Turbo Pascal have given students more direct access to task analyses. The evolution of compilers into Graphical User Interface (GUI) tools (e.g. Delphi and Visual Basic) has enhanced the capacity of students to quickly introduce complex structures. It is with programming environment tools such as Blue (Kolling, ) and BlueJ (Kolling, ) that a true programming language teaching support environment (PLTSE) has been created.

The BlueJ PLTSE has been designed to support the Java language and was introduced into the introductory teaching of that language in the School of Computer Science and Software Engineering (CSSE) in Semester 1 of the 1999 academic year. As this was a unique educational innovation, it was decided that a comprehensive evaluation was warranted to:

  1. Monitor student response
  2. Explore strengths and weaknesses
  3. Evaluate BlueJ as an educational tool

This paper reports on the overall patterns from the evaluation process.

Evaluation Design

A longitudinal design was implemented, allowing for the collection of both formative and summative information. There were four main stages in the design:

  1. A collection of basic background data on students which included attitudes towards the course and experience with programming languages
  2. Collection of intermediate data during the semester
  3. On-line interviews for more in-depth data collection
  4. A final survey to explore the students' overall evaluation of the BlueJ experience.

From this design we would collect a comprehensive database allowing for the monitoring of student reactions over the whole of the semester. To facilitate this, students were asked to identify themselves on the condition that the non-teaching member of the research team would be the only person to have access to identifiable data. The design was presented to the Monash University Ethics Committee and was accepted subject to voluntary student participation.

All data collection tools can be viewed at:

http://www.sd.monash.edu.au/~smarkham/

 

Research Method

All students were asked to fill out a consent form either stating agreement to participate or that they did not wish to participate.

Each of the survey questionnaires was sequentially posted on a web page, which was then referenced from the subject Home Page. Tutors were also informed of the posting of a given questionnaire and were asked to direct student attention to it.

The background survey was posted in week 3, the first monitor survey in week 6, the second in week 9 while the final evaluation survey was posted in week 12.

In the case of the on-line interviews, samples of 8 students were randomly selected and e-mailed at 4 points during the semester. The form of these interviews can also be viewed at the previously quoted URL.

Data was automatically returned to the research directory and amalgamated into the primary data file.

 

Results

Demographic data

Of the 356 students enrolled in CSE1202, 120 agreed to participate in the study. This was a low acceptance rate, but when the average characteristics of this group were compared with data collected for the equivalent subject in previous years, there was no reason to believe that it was a particularly biased sample.

On the first survey, there were respondents from this base sample of 120.

The biographic and demographic characteristics of the sample can be summarised as follows:

74% Male

31% first language not English

32% International full-fee paying student

5% had no access to a computer outside those available at the university

44% direct from high school & 32% from a diploma (most likely the overseas students)

Programming language experience

 

Language        Have studied   Have experience
None            30             18
Pascal/Delphi   30             30
Basic/VB        22             24
C/C++           41             35
Java            4              4
HTML            16             45
CGI/Perl        2              4
Other           15             17

(Entries are the counts for each category)

 

 

Student expectations of the teaching environment

Style of teaching you expect to experience

   

Lectures in large groups        19
Lectures in small groups        15
Tutorials                       27
Computer laboratory sessions    33
Distance education              0
Do not know                     6

 

Level of comfort with instruction methods (7= very comfortable)

 

Instruction method                                                                Mean   S.D.
Lectures                                                                          5.18   1.18
Tutorials where most of the time is spent being given answers to problems         5.02   1.27
Tutorials or seminars where I am expected to participate and make a contribution  4.89   1.26
Problem solving groups where I have to work in a small team                       5.04   1.40

 

 

The First Monitor Evaluation

Q 1.2 Current status of BlueJ installation

 

                                                     % response
I installed it and it works                          25
It took me some time, but I installed it             38
I tried to install it, but I couldn't get it to run  28
I have a computer, but I did not try to install it   8
I don't have a computer at home                      2

 

 

The mean response for Q 1.4 ("Understanding the difference between the Java language and the BlueJ development environment has been") was 4.20, with a standard deviation of 1.7. This indicates that the average student found the distinction neither hard nor easy. When we look at the actual distribution, we can see a trend towards points 4 and 5, but also slight lumps at both the bottom and top ends of the scale.

 

 

 

Understanding the difference between Java language and BlueJ

Scale point        N    %
1 Very difficult   4    10
2                  3    7
3                  5    12
4                  11   27
5                  10   24
6                  3    7
7 Very easy        5    12

As we shall see from both the positive and negative comments, this pattern would fit with the variation in sophistication of those comments.

Of added note here is that there were 12 respondents who checked the 'Don't Know' box and this is consistent with the fact that a number of students had yet to install BlueJ or to fully utilise it.
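The mean and standard deviation quoted for Q 1.4 can be recovered directly from the frequency counts in the table above; a minimal Python sketch (not part of the original analysis, shown only to make the computation explicit) is:

```python
def freq_stats(counts):
    """Mean and (population) standard deviation of a 7-point rating,
    given the frequency counts for scale points 1..7."""
    n = sum(counts)
    mean = sum((i + 1) * c for i, c in enumerate(counts)) / n
    var = sum(c * ((i + 1) - mean) ** 2 for i, c in enumerate(counts)) / n
    return mean, var ** 0.5

# Counts for Q 1.4 (scale points 1..7) taken from the table above
q14 = [4, 3, 5, 11, 10, 3, 5]
mean, sd = freq_stats(q14)  # mean ≈ 4.20, sd ≈ 1.70, matching the reported values
```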

Question 1.5 (Ease of using BlueJ) gave a mean of 3.59 and a standard deviation of 1.56, which is similar to the previous question, but the distribution shows an interesting pattern:

Ease of using BlueJ

   

Scale point        N
1 Very easy        4
2                  11
3                  9
4                  14
5                  6
6                  5
7 Very difficult   2

 

Friendliness of user interface

Mean 4.25, Standard Deviation 1.582

Scale point         N    %
1 Very unfriendly   5    10
2                   2    4
3                   7    14
4                   12   24
5                   13   25
6                   10   20
7 Very friendly     2    4

 

The way it helps me learn Java Programming

Mean 4.42, Standard Deviation 1.674

Scale point      N    %
1 Very little    4    8
2                4    8
3                4    8
4                11   22
5                15   30
6                6    12
7 A great deal   6    12

These four questions were correlated to determine whether there was a consistent pattern in ratings of these performance aspects of BlueJ. The only two questions to be significantly correlated were 1.7 and 1.8 (interface and help in learning), with a coefficient of 0.58 (p<0.05). Students appear to link the interface with their learning of Java, but neither of these relates to the other two questions.
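The coefficient reported here is a standard Pearson product-moment correlation between paired ratings. A minimal Python sketch of the computation, using entirely made-up 7-point ratings (the study's raw data are not reproduced here), is:

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two equal-length rating lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical 7-point ratings for 'interface' (Q1.7) and 'help in learning' (Q1.8)
interface = [5, 4, 6, 3, 5, 7, 4, 2]
learning = [5, 5, 6, 2, 4, 7, 5, 3]
r = pearson_r(interface, learning)  # ≈ 0.86 for this made-up data
```

In practice a statistics package would also report the p-value; the sketch shows only the coefficient itself.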

There were 46 useable comments from the 54 responses to the first monitor survey. The aim of this survey was to establish the key factors determining the students' perceptions of the way BlueJ functioned. To this end they were asked to write in their major positive and negative experiences with using the software. The reporting of their positive and negative experiences ranged from a few words to long discussions of the way BlueJ functioned in comparison with other programming software.

Positive responses

Q: 1.3 What has been the best thing about using BlueJ? (Think about the time when you thought "This is great")

The positive responses fitted into two main categories:

Comments on the value of BlueJ within the OOP context - 9

The design and implementation of the graphical interface - 11

Another group of comments covered the general "feel" of the program and its interface for those who had not used many other packages or who had not used Java. A further group talked about the value they saw in BlueJ as a tool to help them get through their assignments.

Examples showing the breadth of the comments would be:

"I think the BlueJ is a cool programming environment."

"Saves code automatically!!!"

" .. I think you call it the workbench area. Additionally, being able to compile the objects and it does not allow to proceed, reporting any syntax errors. For a starter it a good tool for learning the concepts of programming / Java ."

".. when I started doing my assignment and finally figured out some, far from all, the functions of Bluej. The windows type environment lends a type of familiarity and comfort in working with it. Once you actually know what its doing, its really not all that bad."

"Although it reacts sometimes quite slow on my 166MMX laptop computer, but it is quite easy to use. However, the first time I use it, I really panic, because there is a big difference between BlueJ and the other programming development application, like Visual C++. One surprise to me is that, BlueJ will automatically open a very beautiful and very well-structure window for the user to input something, without a "READ" command or other related statement in other programming languages. "

"There hasn't actually been a time when I said 'this is great'. But I think that it simplifies the whole concept of programming With Java. Makes us concentrate on the main parts of programming And the whole 'object oriented programming' issue is a bit more easily understood."

The range and level of both positive and negative comments provided the BlueJ development team with essential information on how the program could be enhanced.

 

Critical Responses

Q 1.4 What has been the worst thing about using BlueJ? (Think about a time when you thought "This is terrible")

The critical responses were, as might be expected, more specific than the positive ones. Many also detailed various problems and failings. For example:

"where do I start?
the error messages are not very helpful in the compiler
the speed of BlueJ in general
the download time for installing"

Some recognised that they were using a new piece of software and that it would be expected to have some problems:

"When I had a bit of trouble with it as some strange messages appeared but it was ok later on. I guess it is just because it's a new program."

Others offered a small-scale consultant's report:

"But there are some minor bugs that needs to fixed. ie: If you are hoping to market this product for Window 95/98 platforms.

The on-line help is the worst I seen in my life. It is very easy to create on-line help within the BlueJ environment itself, without having to connect to the Internet. In you case, If you don't have connection to the Internet you don't have help for BlueJ... funny isn't it :)

**** And the standard CUT, COPY, PASTE editing function require a mouse click on the buttons.

Well I've a lot of program editor and text editors in OS such as Lynux, DOS, UNIX, Windows 3.11, Windows 95, VSE/SP (main frame) but this is the worst editor I've used. If there is no, mouse you cannot cut or copy or use any other functions in the editor."

 

The Second Monitor Evaluation

A total of 65 students responded to the request to fill out the second monitor evaluation form.

 

Table xx Understanding the difference between Java language and BlueJ

 

                   Current        Previous
Scale point        N    %         %
1 Very difficult   0    0         10
2                  0    0         7
3                  9    14        12
4                  9    14        27
5                  13   21        24
6                  9    15        7
7 Very easy        22   36        12

Table xx gives the results for the question "Understanding the difference between the Java language and the BlueJ development environment has been" for both the current monitor and the previous one. It can be seen that students' perceptions have shifted markedly on this issue.

Ease of using BlueJ

 

                   Current        Previous
Scale point        N    %         %
1 Very easy        10   16        8
2                  9    15        22
3                  14   23        18
4                  10   16        27
5                  8    13        12
6                  7    11        9
7 Very difficult   4    7         4

 

Friendliness of user interface

       
 

                    Current        Previous
Scale point         N    %         %
1 Very unfriendly   4    7         10
2                   7    11        4
3                   7    11        14
4                   14   23        24
5                   16   26        25
6                   11   18        20
7 Very friendly     3    5         4

It can be seen that there is little change in student perceptions of the friendliness of the BlueJ user interface between the first and second monitor.

The way it helps me learn Java Programming

 

                 Current        Previous
Scale point      N    %         %
1 Very little    5    8         8
2                4    6         8
3                5    8         8
4                10   16        22
5                16   25        30
6                15   24        12
7 A great deal   8    13        12

The high mean rating in the first monitor survey made it a little difficult for BlueJ to make a startling improvement by the time of the second monitor, but there is a definite shift.

How would you say your attitude to BlueJ has changed over the past few weeks?

 

                           N    %
No change                  15   24
More positive about BlueJ  27   43
More negative              12   19
Not sure                   9    14

The trend in this overall rating by students is consistent with the other rating shifts between the first and second monitor surveys.

 

BlueJ and Coping

Correlations between the BlueJ ratings and the two coping questions in this monitor survey show that the educative elements of BlueJ relate positively to rated level of coping with the subject.

The negative correlations of "Ease of Using" on "BlueJ help with Java" and "Coping with subject" reflect the direction of the rating scales.

 

Intercorrelations between items from second monitor survey

 

                         Teaching  Ease   Interface  Help   Coping  Workload
BlueJ and teaching Java  1.00
Ease of using            -0.27     1.00
Interface                0.13      0.07   1.00
BlueJ help with Java     0.37      -0.32  0.32       1.00
Coping with subject      0.67      -0.31  0.17       0.48   1.00
Workload in subject      0.29      0.13   0.14       0.20   0.41    1.00

 

The Final General Evaluation

The final evaluation looked at both BlueJ and the overall response to the subject.

 

Table x.x BlueJ item characteristics

 

                             Mean   SD
2.1 Overall rating           4.2    1.6
2.2 Interface                4.5    1.5
2.3 BlueJ and teaching Java  4.3    1.7
2.4 Stability of BlueJ       2.7    1.3
2.5 Down-time                2.9    1.4
2.6 BlueJ help with Java     4.3    1.8

It can be seen from Table x.x that there is a great deal of difference between the ratings of BlueJ's software performance and its role in the teaching environment. Only seven students rated BlueJ's stability at 6 or 7, and only four rated its down-time at that level.

 

The intercorrelations between the 6 overall evaluation questions for BlueJ are presented in Table x. They show that students gave consistent ratings to BlueJ and its interface as well as the extent to which BlueJ helped them learn Java.

 

                Overall  Interface  Teaching  Stability  Down-time  Java
Overall rating  1.00
Interface       0.72     1.00
BlueJ/teaching  0.72     0.71       1.00
Stability       0.61     0.55       0.45      1.00
Down-time       0.51     0.47       0.45      0.59       1.00
BlueJ/Java      0.67     0.53       0.69      0.38       0.24       1.00

The BlueJ ratings had no relationship to the overall ratings of the subject or course. Neither did they impact upon likelihood to recommend either the course or Monash.

Table x.x Correlations between BlueJ ratings and Satisfaction and Recommendation

 

                           Overall  Interface  Teaching  Stability  Down-time  Java
Satisfaction with subject  0.12     0.16       0.12      0.03       0.02       0.15
Satisfaction with course   0.04     0.06       0.17      0.21       0.22       0.10
Satisfaction with Monash   0.10     0.05       0.20      0.04       0.22       0.23
Recommend Course           0.01     0.07       0.04      -0.05      0.00       0.08
Recommend Monash           0.05     0.07       0.18      -0.04      0.01       0.20

 

The BlueJ ratings on 'overall', 'interface' and 'BlueJ/Java' were positively correlated with 5 questions:

Lab exercises

Lecture Notes

Computer Lab Exercises

Kept up

Confident about the exams

This pattern suggests what might be called a positive learning pattern, where high ratings of BlueJ correspond to positive reactions to the key teaching elements and with the individual study process.

These three aspects of BlueJ were negatively correlated with the three questions (5.3-5.5) about the pace and difficulty of the subject. This result has to be seen in terms of the direction of the rating scales. The negative correlations mean that those students who were positive about BlueJ were more likely to be happy with the Pace/Difficulty of CSE1202.

BlueJ did not significantly impact upon the sense of confidence that students had in completing the subject or the year of the course. It also had near-zero correlations with perceived confidence in doing CSE1203, the subject the students would take in the next semester, which also used BlueJ.

 

Summary and Conclusions

The BlueJ experience for the CSE1202 students who participated in the research was obviously very mixed. What was important was that those who participated were willing to be helpful and responsive to the requests for information. The comments collected from the two monitor phases were important in the continuing development of BlueJ to allow it to function effectively as a learning support tool.

The fact that BlueJ was not seen to have any negative effects on the overall performance in the course was an interesting result, given the difficulties students reported in installing and running it, particularly during the first part of the semester.

This research project will continue as BlueJ is included in the Java teaching in the next semester with the same cohort of students.