Dr. Laura Lynn: Good afternoon, everyone. One of our responsibilities at the center is to enrich the activities that contribute to the quality and productivity of Walden University research. Research can be an incredibly rewarding, but also isolating, experience. Today’s presentation gives one graduate the opportunity to share not only the results of her research, but also the story of her research journey. Having recently completed her dissertation research, Dr. Celeste Schwartz can reflect upon the doctoral research process firsthand. By being here today, you can take away insights into how you can actively enter scholar/practitioner discourse while creating opportunities for positive social change. Dr. Schwartz earned a Ph.D. in Education. Her dissertation, “The Impact of Faculty Learning Styles on Their Perceived Usefulness and Perceived Ease of Integrating Media Rich Content into Instruction,” links teachers’ personal learning styles to their comfort with technology in the classroom. Dr. Schwartz’s findings may encourage teachers to step outside of their comfort zones and use technology in new ways in order to reach students who benefit from interactive learning. Further, these insights can inform professional development strategies that support initiatives to integrate more technology and make pedagogical changes. Her dissertation committee chair is Dr. John Cooper, a faculty member in The Richard W. Riley College of Education and Leadership, and her methodologist is Dr. Daniel Salter, director of Strategic Research Initiatives for the Center for Research Support and also a faculty member in the Riley College of Education and Leadership. Dr. Schwartz’s comprehensive research reminds us all of what’s possible here at Walden. We have asked Dr. Reginald Taylor from the School of Psychology and the College of Social and Behavioral Sciences to moderate today’s discussion. Dr. Taylor is a curriculum developer, mentor, and talented researcher.
And his questions will guide the discussion as Dr. Schwartz recounts the research experiences that brought her to where she is today. And now, it is my pleasure to welcome Dr. Daniel Salter, Dr. John Cooper, and Dr. Reggie Taylor to the stage. And Dr. Celeste Schwartz.
Dr. Celeste Schwartz: Thank you.
Dr. Daniel Salter: Thanks.
Dr. Lynn: Thanks. Thanks.
Dr. Reggie Taylor: Thank you, Dr. Lynn. Before we commence with the formal part of this, I just want to say how privileged and honored I feel to moderate this session. And specifically because I remember you walking around that maze in Lansdowne trying to find your way, and I was a brand-new professor trying to find my way also. And it’s a beautiful thing to see you sitting here and to know that I facilitated some of your sessions, and I think we did some academic advising, and I had a small piece in your success.
Dr. Schwartz: Thank you.
Dr. Taylor: So I’m very honored and privileged to be here with you.
Dr. Schwartz: Thank you.
Dr. Taylor: So welcome, everyone! What a pleasure to be here discussing your research with our community of scholarly practitioners. Let’s begin our conversation with you, Dr. Schwartz. Please give us a brief overview of your research.
Dr. Schwartz: Let me give you a little background on why I pursued the topic area that I pursued. As we all well know, there’s a lot of national attention today on student success and student completion. And we are seeing organizations such as the Gates Foundation supporting initiatives that seek to improve college readiness and college completion in the United States through the use of technology. We also see at our institutions that for
some reason we have faculty that embrace technology, and for some reason we have faculty that do not embrace technology. And there’s very little in the research about the whys of that. So I was very interested in trying to look at technology and technology integration into the curriculum and discover one tiny piece through one small lens. And I decided with my research that I would look at something that really very few researchers have looked at as it relates to faculty. And that is learning styles. There’s a lot of research on student learning styles and student adoption of technology, but very little research on faculty learning styles and faculty adoption of technology. So as I crafted my research and my research design, obviously you have to make decisions. So the first decisions that you have to make are decisions on how you are going to gather your data. What are the appropriate instruments? And what really drives that is: what is your research about? What are your research questions? So in looking at—and I’ll take these in pieces—first in looking at technology, which, by the way, is my career. I’m a vice president for technology at a community college. Looking at technology, there was a vast array of survey instruments that I could have used. As for the survey instrument and the theory—first of all, let’s talk about the theory. The theory I decided to use was Davis’ Technology Acceptance Model. And there is a survey that Davis has developed that allows you to interchange, so to speak, the types of technologies that you want to look at. In my case, once again, as you develop your design, you really have to narrow it down and be focused. So I selected a technology that has a lot of promise with students with different learning styles. And that’s the integration of media-rich content into the curriculum. So on one hand, I had one instrument, which was the
Technology Acceptance Model. On the other hand, I needed to select an instrument for the learning styles. And there’s a vast array of instruments that I needed to explore and look at. And the instrument I ended up using was the Learning Style Inventory, which is Kolb’s Learning Style Inventory. And you may ask the “why?” part of that. Well, Kolb really looks at experiential learning. And when you think about technology and how we all learn technology, it’s by doing it. It’s not by sitting back and listening. You really have to touch it. So getting back to sort of homing in on this research, and giving you an overview of what occurred: basically, two instruments, two normed instruments, doing some research against the data, developing my methodology, and then finally, and
I think we’ll get into more specific questions, and I’ll be able to go into some of that, but the important piece for me was that at the conclusion there were real findings. I had three research questions. The first two looked at the impact of—first one—perceived usefulness of technology, and that was really looking at it through the lens of the learning style. The second was the perceived ease of use. And finally—and this is something that you may find when you do your research—as you’re doing your literature review, what you’re looking for is what’s there, and what are the other scholars saying? Unfortunately, when I looked at the research, what I found was that in the literature the findings were split as far as the relationship between perceived usefulness and perceived ease of use of technologies. So in some cases researchers found them related. And in others, found them not related. For that reason, I decided to add a secondary component to my study, and I actually also looked at, for my study, how those related.
Dr. Taylor: Fantastic. You’ve mentioned a lot of literature, a lot of instruments, and there’s so much information out there, obviously, when we start on this project. And the key is to take all that information and condense it and come up with a doable project. Can you talk to us a bit about sifting through this mass amount of material and getting to something that’s doable?
Dr. Schwartz: So let me first confess that that was the part of my journey that I loved the most. And I think John and Daniel know that. I loved digging into and finding the research. The problem with that is you end up with masses of research. I probably had a couple hundred pieces of research. So this was my approach. My approach was first broad, and then deep. So what I mean by that is
I really had two topical areas. One was learning styles. The other was technology acceptance and, ultimately, integration of technology. The underlying pieces of that were the deep components. So first, I categorized in those two areas. Then I created a laundry list, based on the reading that I was doing through the literature review, of what I will call subtopic areas. And then I started taking all of this research, all of my literature reviews, and started putting them in subtopic areas. Now, I know some people—and you would think, me being a technologist, that I would have used all those great tools out there that help you categorize and organize. So I tried two different tools. That didn’t work for me. So what worked for me was a lot of space on the floor. And I think my husband can attest to this. I would just print everything out, and I would make all kinds of notes, and then sort of shuffle them all over the floor and categorize them. And so I want to come back to it: breadth, then depth. When you get to the depth piece, you really need to start compartmentalizing everything into categories. I not only had a lot of this research hard-copy printed, I categorized those also on my computer just by creating file folders.
Dr. Taylor: Fantastic, fantastic. I can attest to all the materials on the floor. I actually had to expand to the walls also. So every available space will be used in your house. Dr. Cooper, how did you facilitate Dr. Schwartz’s research at its earliest stages?
Dr. John Cooper: You know, I have to say that in all honesty when Celeste did this, she came to me and I was very excited about the topic. I think her area, looking at technology and our use of it in education, is important. When she came to me and said, “I’m doing a quantitative study,” and I am a qualitative kind of guy, my heart raced. Now, Celeste also knows that she was the first. And I just sat down there and told her, I said, “I remember many times you said, ‘Gosh, I hate being the first.’” She’s the first of my mentees to go forward in this role.
Dr. Taylor: Yes, I overheard that conversation, yes.
Dr. Cooper: So as I facilitated this research in its early stages, it was with deference to the process, and to this guy to my right, Dr. Salter, who did an outstanding job. And it was a pleasure to be a part of a committee with him. As we moved forward, I think I was very much a partner with Celeste in a number of ways. When we had the frequent conference calls with Daniel, I was another set of ears listening. We’d almost always talked before that call. So we understood what she was looking to get answers for, what help she needed. And as she sought to get those answers, I was listening and taking notes frantically along with her, so that we could do a little debriefing and move forward in that regard. So that was certainly one thing that I did as we started. And that’s the way I facilitated. I always made myself available. And I do really mean, and I stress, that I felt like it was a partnership. Right through to the end, and even beyond. I spent a lot of time on the phone with Dr. Schwartz, and we got to know each other, and I felt like it was a little bit my research—but it wasn’t even close, because she did it, and she did it very well. One of the things I’d have to say, and if you listened to her just now, you can recognize it, is that one of the blessings that I had in this process is that she represents a scholar-practitioner.
Dr. Taylor: Yes.
Dr. Cooper: In the true way that we read about. Some of you who’ve had the Foundations course, the scholar-practitioner course, have read the golden work about being a generative person, a transformer, and a conserver. This scholar is all of those things, and she did her homework. So when she came to me and said, “Here’s what I want to do,” she didn’t just come. She had, as you pointed out, done a lot of examination of what those instruments were. I just recently deleted a couple of bookmarks that had referenced other instruments besides the Davis Technology Acceptance Model. She’d done her homework. So it made it much easier for me because of that. But that was how I facilitated: being a partner, helping her, and moving through the process with her.
Dr. Taylor: Yes, I’m also amazed looking at this, because again, I remember you from Lansdowne, and just the growth that I see now. And I look at myself as a Walden student—I was a graduate from Walden—and the growth. And as I’ve told many of you, it’s just going to be amazing where this place can and will take you. So I just wanted to throw that in. Dr. Salter.
Dr. Salter: Sir?
Dr. Taylor: OK, you served as the methodologist for this study. What specific action or approach could students adopt from Dr. Schwartz’s process as it relates to choosing the appropriate methodology?
Dr. Salter: Well, that’s a really good question. Celeste definitely did her due diligence. And I think we heard earlier, and what I would say to anybody is, to kind of fall in love with your variables. You know, you need to know them inside and out. What they’re about, how they’re measured, all that sort of stuff, because the statistics package assumes you know these things. But it’s also good for planning. I mean, we’ve talked a lot about the good stuff that’s been going on with this project, and there were some places where we had to kind of Plan B it. And knowing your variables inside and out makes all the difference. A couple of examples from Celeste’s study—one of them was age. She used age as a covariate. Made some sense. I mean, the scholarship supports age as a variable to understanding technology use. And we were using it as a covariate. And for those of you who don’t know how covariates work, it’s kind of like clearing some of the fog away by using a known variable. But that didn’t pan out.
Dr. Schwartz: Nope.
Dr. Salter: And we looked at that, and we talked about it, and, you know, when we really kind of looked at it, I think we understood why it didn’t work. The other one was she was talking about learning style, and the learning style didn’t quite go the way that we expected as well. But because she knew it, and she knew it inside out, we could kind of dig a little deeper into what was going on with the study, and find that meaningful result. Because you know, when you’re doing the research, you know it’s there. I mean, when there’s a big pile of variance there, you know there’s something to talk about. It’s just a matter of figuring out what the story is. And I think that was a big take-away: she really got to know these variables well. And I think that’s the thing I would recommend as a single good strategy.
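(Editor’s note: Dr. Salter’s “clearing the fog” description of a covariate can be sketched numerically. The snippet below is purely illustrative—the numbers are invented, not Dr. Schwartz’s data—and it uses only the Python standard library to regress an outcome on the covariate, age, and keep the residuals, which is the intuition behind how ANCOVA removes a known variable’s share of the variance.)

```python
import statistics

def residualize(y, x):
    """Fit a simple least-squares line of y on x and return the residuals,
    i.e. y with the covariate's linear contribution removed."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return [yi - (my + slope * (xi - mx)) for xi, yi in zip(x, y)]

# Invented data: perceived ease-of-use ratings that partly track age.
age  = [30, 35, 40, 45, 50, 55, 60, 65]
ease = [6.1, 5.8, 5.5, 5.6, 5.0, 4.9, 4.6, 4.4]

adjusted = residualize(ease, age)
# The variance left after removing the covariate is the "fog-cleared" part;
# it is smaller whenever the covariate explains anything at all.
print(statistics.pvariance(ease) > statistics.pvariance(adjusted))  # → True
```

When the covariate turns out to explain almost nothing, as happened with age in this study, the adjusted and unadjusted analyses tell the same story, and dropping back from ANCOVA to plain ANOVA, as the committee eventually did, is the simpler choice.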
Dr. Taylor: Can you talk a little bit, Dr. Schwartz, about your participants and how they were recruited? Because I always find that perhaps that is one of the most difficult pieces: we have this idea and this research, but we’ve got to find people, and we’ve got to get to them, and we’ve got to get the amount that we need. Can you talk about the recruitment process?
Dr. Schwartz: Yes. But I think before I talk about the recruitment process, I just want to remind Daniel and John, when I decided who the participants of my study were going to be, I will never forget both of you saying, “Faculty?! You’re going to survey faculty?! Do you know how hard that is to do?” So yes, my participants were community college faculty from two mid-Atlantic community colleges, full-time teaching faculty only. There were 380 faculty. And I think to all of our pleasure and surprise, I had 165 participants, and, of those, 149 valid responses. Which was really pretty amazing, because, once again, as I did that literature review, I knew that John and Daniel were both sort of correct: lots of studies had to conclude that they didn’t quite have enough participants to really draw any type of findings or conclusions. So I was very lucky. So how did I go about this? First of all, I wanted to know a little bit about the institutions that I researched. And the reason for that is that when you think about the kind of study that I did, faculty development and a college’s or university’s breadth of technology would be very important. So I needed to use colleges that had the technology infrastructure and the technology training and support that enabled good work for faculty to integrate technology. And so with the two institutions that I selected, I knew both of those institutions, and I knew that they were both well-known for their support of faculty in integrating technology. So I think that helped a little bit, because there was sort of that culture at both institutions, that technology was supported by the administration and recognized as something of value.
As far as how I recruited them—the real nitty-gritty specifics, and then I want to mention something about a pilot study that I did beforehand—I basically got IRB approvals from the two institutions through their institutional research departments. I then got email addresses for all of the faculty that fit the criteria and sent an email announcing my study. But I did not send an email asking if they wanted to participate. Because I looked at the research, through my literature reviews, and oftentimes, especially when dealing with faculty, many of the studies indicated that they would get a lot of “no’s” when they would ask that initial question. So I knew I needed a consent, but I put my consent form within my packet. The other thing I learned from my literature review was: be careful about using modern technology when doing surveys. I actually did a paper—me, a technologist—I did a paper-and-pencil survey packet. And I do believe that I got much better results than what I would have gotten from an electronic process. Because if faculty truly were apprehensive about using technology, my fear was I would not have included them in my study. So I think you have to be cautious about thinking through some of the newer ways that we can go about administering surveys, and whether they will or will not fit within your plan and help you to get the survey results that you want.
Dr. Taylor: Yeah, what I’m hearing you say is I think it’s important also not just to review the literature around your variables and concerning your variables, but even looking at literature that addresses the recruitment process. And those characteristics and those things that may yield you a sample that’s too small for you to use. So, you’re researching a lot of things to help inform you and make those decisions, yes. Let’s talk a little bit about the data analysis process. Just briefly describe that.
Dr. Schwartz: I don’t know if I could briefly describe that. Daniel, what do you think? It ended up being this big!
Dr. Salter: It’s either got to be a really brief conversation, or a really big one. So let’s start with brief.
Dr. Schwartz: Let’s try for brief, huh? OK, so the easy stuff. Obviously, the easiest piece of the analysis is doing the descriptive analysis. So I did develop my own demographic survey instrument, which was part of the survey packet. So that’s the easy stuff. You’re just putting in: Who are your participants? What do they look like? So age, the discipline that they were in. I asked a pretty interesting question that I couldn’t do very much with, but I did ask them if they had done any development in media-rich content and had integrated any media-rich content in the curriculum. So all of that I sort of described. So that was sort of the easy part. Then, the next step was I needed to really look at measurement reliability, so I used Cronbach’s Alpha for my learning style cycle
modes. And basically there are four learning cycle modes. I won’t go into that, because of the time. And also for my technology instrument. Learning style was, by the way, the independent variable. And then my dependent variables were within the Technology Acceptance Model, where there were two variables, perceived usefulness and perceived ease of use. All of that turned out fine. That was step one in this sort of diagnostic testing. The second analysis that I did was using the Pearson Product-Moment Correlation. And this was really looking to see if there’s a relationship, and Daniel mentioned this earlier, between the covariate, age, and the dependent variables, perceived usefulness and perceived ease of use. In that case, as Daniel indicated, we really needed to take a little step back based on the data that we found on the perceived usefulness piece, but really needed to explore more on the perceived ease of use piece, which caused more analysis to be done. And then, the third diagnostic test—and remember, I haven’t even started anything from my standpoint that’s real yet—was that I needed to be sure that there would be no correlation between my independent variable and the covariate, age. So that’s the diagnostic. So I guess before I move on and tell you about the rest, I just want to emphasize that that’s the piece I was surprised by. I didn’t think I’d have to do diagnostic testing to figure out whether my data was OK or not. But that needed to be done. And then I got into the real part of my study. First of all, I had, as I said, three research questions. On the perceived usefulness piece, I used an ANOVA. On the perceived ease of use research question, I used an ANCOVA, and then reversed, and used an ANOVA, because when we looked at age further, it really added nothing to the research. In some ways it sort of clouded the research.
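(Editor’s note: for readers unfamiliar with the reliability step Dr. Schwartz describes, Cronbach’s Alpha can be computed directly from its definition. The sketch below uses only the Python standard library and invented Likert-scale responses, not the data from her study.)

```python
import statistics

def cronbach_alpha(items):
    """Cronbach's Alpha for a scale.

    items: one list of scores per item, all the same length
    (one score per respondent).
    """
    k = len(items)
    item_var_sum = sum(statistics.pvariance(item) for item in items)
    totals = [sum(scores) for scores in zip(*items)]  # per-respondent totals
    total_var = statistics.pvariance(totals)
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

# Invented 3-item scale answered by five respondents (1-7 Likert ratings).
items = [
    [5, 6, 4, 7, 3],  # item 1
    [5, 5, 4, 6, 3],  # item 2
    [6, 6, 5, 7, 4],  # item 3
]
print(round(cronbach_alpha(items), 2))  # → 0.98, above the usual .70 cutoff
```

Running this reliability check before the substantive analyses mirrors the order Dr. Schwartz describes: only once each instrument’s scales hold together does it make sense to correlate them or run the ANOVA/ANCOVA models.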
Dr. Taylor: Yes, I would argue, too, though, that that diagnostic piece was probably the most informative piece, and the most necessary piece to see exactly how that data fits what you’re trying to do. And it appears to me that it allowed you to make the appropriate adjustments.
Dr. Schwartz: There was no way around it. For any of you, if you ever have Daniel on your committee, you know that you’re never going to get past the diagnostic piece until it’s perfect.
Dr. Taylor: Yeah, it’s vital.
Dr. Schwartz: So, and it is perfect. I can say that. But there was no—without that piece, I would have gone down a sort of a bumpy road as far as the age component, because I was totally convinced that age was going to make a difference.
Dr. Schwartz: And it did not. So in conclusion on the methodology piece—I’m sorry, on the analysis piece—when we went through the analysis on the perceived ease of use, what we found was that there was significance. But what you have to understand is that when you do those kinds of tests—and I had four different learning styles—I had no clue if one learning style over the other mattered. So behind that research I actually had another piece of analysis to do—I thought, one more piece of analysis. But it wasn’t one more piece, was it, Daniel? It was two more pieces. So the next piece of analysis that I needed to do was to look for the specific differences in perceived ease of use versus each of the four learning styles. And the data that came out of that was really very interesting. We did two different tests. We did a t-test, and we also looked at mean scores. And what happened when we looked at those two tests is that I went from four groups to two groups. The four different types of learning styles actually collapsed, because the mean scores were identical for two and identical for the other two. So that then meant that I needed to dig deeper, and I needed to really go beyond that and look all the way back to sort of where we started and where the data started us: you start with learning modes, and then you merge modes together to determine what the learning style is. We went all the way back to the learning modes, did more analysis against the learning modes, and the bottom-line conclusion is pretty simple. I say it’s simple—these guys got very excited—but the bottom line is that faculty who have a learning mode of reflective observation are less likely to perceive technology as easy to use. And that was really the conclusion of the research.
But I think for the students, the most important piece to understand is that this was, to me, a lot more analysis and research than what I thought we were going to end up doing, because as we went through, the data would appear different than expected—I think we had to sort of reverse course a few times with the analysis.
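(Editor’s note: the group comparison Dr. Schwartz describes—checking whether one learning-style group rates perceived ease of use differently from another—can be sketched with a two-sample t statistic. The numbers below are invented for illustration; her study used scores from the Kolb and TAM instruments. Welch’s version of the statistic is used here because it does not assume equal group variances.)

```python
import math
import statistics

def welch_t(a, b):
    """Welch's two-sample t statistic (no equal-variance assumption)."""
    va, vb = statistics.variance(a), statistics.variance(b)
    se = math.sqrt(va / len(a) + vb / len(b))
    return (statistics.fmean(a) - statistics.fmean(b)) / se

# Invented perceived-ease-of-use scores for two collapsed groups:
# faculty whose dominant mode is reflective observation vs. everyone else.
reflective = [3.9, 4.2, 4.0, 4.4, 3.8, 4.1]
others     = [5.1, 4.9, 5.4, 5.0, 5.3, 4.8]

t = welch_t(reflective, others)
print(round(t, 2))  # → -7.86; the reflective group rates ease of use lower
```

A real analysis would also compute the Welch degrees of freedom and a p-value (for example via `scipy.stats.ttest_ind(..., equal_var=False)`), and then, as Dr. Schwartz describes, trace a significant group difference back down to the underlying learning modes.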
Dr. Taylor: OK, fantastic.
Dr. Cooper: Reggie, could I...
Dr. Taylor: Sure.
Dr. Cooper: ... pick up on Dr. Schwartz’s observations, or her talk about analysis? Some of you may be sitting out there in the audience now shaking in your boots. And I’m sitting here listening to her, just awed by her work, by how impressive it was, and by how complex it was. And, with Daniel’s assistance, how much she dug deeply into this. Maybe not all of you are going to find that level of complexity in your research. And you don’t need to be intimidated by that. Because we have a lot of good scholars and methodologists like Daniel, who really put their heart and soul into helping her to get that completed. And I went along with the journey as the partner I said I was. And it was a very exciting journey. You can do it. And I would also just want to point out, Reggie, if you’ll allow me just one more.
Dr. Taylor: Just briefly, please. Thank you.
Dr. Cooper: For those who might not be thinking about that kind of quantitative and complex research, you are no less off the hook. Because you need to think about your methodology, you need to think about those research questions, and you need to understand how those questions drive the methodology and dig a little deeper into those questions. The way she went at it was one of the most impressive parts of the experience with Dr. Schwartz.
Dr. Taylor: Great. I’ll close with a final question to you, Dr. Salter. We’ll let you close the show.
Dr. Salter: Oh, good.
Dr. Taylor: OK. What advice do you have for current and future doctoral candidates as they embark upon their research journeys?
Dr. Salter: On my note here I have two words, “best guess.” I think Celeste’s case is a textbook case of this: your proposal is your best guess of what you’re going to be doing. You know, you’ve done your due diligence, you’ve covered all your bases, you’ve got the tightest design possible, but that doesn’t mean it will happen. And we had to Plan B it a couple of times, in spite of all of this. I mean, she had a good design and everything worked, but it wasn’t exactly what she set out to do. And that’s fine. I mean, when you get out in the real world with your dissertation, people are going to ask you, “What did you learn?” Not, “What did you verify?” OK? Or, “What did you substantiate?” “What did you learn?” And you learned a lot. It’s all those bits of serendipity. You know, John and I both had the same response to this reflective observation finding—you know, all the introverts out there, we don’t like technology, because we don’t want to get our hands on it yet. And I mean, it just made perfect sense to us. But that wasn’t there at the beginning. So you know, view the proposal as your best guess of what’s going to happen. And that way, you open yourself up to all these fun kinds of findings that you might not have found. And it wasn’t as grueling as she made it out to sound.
Dr. Schwartz: Yes, it was.
Dr. Salter: It was worse!
Dr. Lynn: Thank you to our panel. I hope you have found this presentation to be both inspiring and informative. Research is the cornerstone of a graduate degree. In doctoral programs, you critically analyze, research, and discuss the literature in your respective fields; propose and conduct research to meet the requirements for earning a terminal degree; make research-based decisions to inform your roles as scholar-practitioners; and disseminate research findings to implement positive social change locally, nationally, and globally. That all sounds impressive. It is. You all will do, and are doing, impressive work. It’s not easy. We all know that. But you keep at it, and enjoy the process wherever you can. You have faculty, as you saw here, research support, and each other to guide you when the going gets tough. Thank you for joining us today.