Is the development of robotics helpful or harmful?

Reference no: EM131046202

Argumentative Synthesis Worksheet

Prompt: Some people believe that the use of robotics is advancing our society. Others contend that such developments are changing society negatively. What is your position on this issue? Is the development of robotics helpful or harmful?

1. A Swiveling Proxy That Will Even Wear a Tutu, by ROBBIE BROWN, JUNE 7, 2013

https://www.nytimes.com/2013/06/08/education/for-homebound-students-a-robot-proxy-in-the-classroom.html?_r=0

2. How One Boy With Autism Became BFF With Apple's Siri, by JUDITH NEWMAN, OCT. 17, 2014

https://www.nytimes.com/2014/10/19/fashion/how-apples-siri-became-one-autistic-boys-bff.html

3. The Ethical Frontiers of Robotics, by Noel Sharkey

https://webpages.uncc.edu/~jmconrad/ECGR4161-2011-05/notes/Science_Article_Robotics_Ethics2.pdf

4. The Robotic Moment

By Sherry Turkle

In late November 2005, I took my daughter Rebecca, then fourteen, to the Darwin exhibition at the American Museum of Natural History in New York. From the moment you step into the museum and come face-to-face with a full-size dinosaur, you become part of a celebration of life on Earth, what Darwin called "endless forms most beautiful." Millions upon millions of now lifeless specimens represent nature's invention in every corner of the globe. There could be no better venue for documenting Darwin's life and thought and his theory of evolution by natural selection, the central truth that underpins contemporary biology. The exhibition aimed to please and, a bit defensively in these days of attacks on the theory of evolution, wanted to convince.

At the exhibit's entrance were two giant tortoises from the Galápagos Islands, the best-known inhabitants of the archipelago where Darwin did his most famous investigations. The museum had been advertising these tortoises as wonders, curiosities, and marvels. Here, among the plastic models at the museum, was the life that Darwin saw more than a century and a half ago. One tortoise was hidden from view; the other rested in its cage, utterly still.

Rebecca inspected the visible tortoise thoughtfully for a while and then said matter-of-factly, "They could have used a robot." I was taken aback and asked what she meant. She said she thought it was a shame to bring the turtle all this way from its island home in the Pacific, when it was just going to sit there in the museum, motionless, doing nothing. Rebecca was both concerned for the imprisoned turtle and unmoved by its authenticity.

It was Thanksgiving weekend. The line was long, the crowd frozen in place. I began to talk with some of the other parents and children. My question-"Do you care that the turtle is alive?"-was a welcome diversion from the boredom of the wait. A ten-year-old girl told me that she would prefer a robot turtle because aliveness comes with aesthetic inconvenience: "Its water looks dirty. Gross." More usually, votes for the robots echoed my daughter's sentiment that in this setting, aliveness didn't seem worth the trouble. A twelve-year-old girl was adamant: "For what the turtles do, you didn't have to have the live ones." Her father looked at her, mystified: "But the point is that they are real. That's the whole point."

The Darwin exhibition put authenticity front and center: on display were the actual magnifying glass that Darwin used in his travels, the very notebook in which he wrote the famous sentences that first described his theory of evolution. Yet, in the children's reactions to the inert but alive Galápagos tortoise, the idea of the original had no place. What I heard in the museum reminded me of Rebecca's reaction as a seven-year-old during a boat ride in the postcard-blue Mediterranean. Already an expert in the world of simulated fish tanks, she saw something in the water, pointed to it excitedly, and said, "Look, Mommy, a jellyfish! It looks so realistic!" When I told this story to a vice president at the Disney Corporation, he said he was not surprised. When Animal Kingdom opened in Orlando, populated by "real"-that is, biological-animals, its first visitors complained that they were not as "realistic" as the animatronic creatures in other parts of Disneyworld. The robotic crocodiles slapped their tails and rolled their eyes-in sum, they displayed archetypal "crocodile" behavior. The biological crocodiles, like the Galápagos tortoises, pretty much kept to themselves.

I believe that in our culture of simulation, the notion of authenticity is for us what sex was for the Victorians-threat and obsession, taboo and fascination. I have lived with this idea for many years; yet, at the museum, I found the children's position strangely unsettling. For them, in this context, aliveness seemed to have no intrinsic value. Rather, it is useful only if needed for a specific purpose. Darwin's endless forms so beautiful were no longer sufficient unto themselves. I asked the children a further question: "If you put a robot instead of a living turtle in the exhibit, do you think people should be told that the turtle is not alive?" Not really, said many children. Data on aliveness can be shared on a "need-to-know basis"-for a purpose. But what are the purposes of living things?

Only a year later, I was shocked to be confronted with the idea that these purposes were more up for grabs than I had ever dreamed. I received a call from a Scientific American reporter to talk about robots and our future. During that conversation, he accused me of harboring sentiments that would put me squarely in the camp of those who have for so long stood in the way of marriage for homosexual couples. I was stunned, first because I harbor no such sentiments, but also because his accusation was prompted not by any objection I had made to the mating or marriage of people. The reporter was bothered because I had objected to the mating and marriage of people to robots.

The call had been prompted by a new book about robots by David Levy, a British-born entrepreneur and computer scientist. In 1968 Levy, an international chess master, famously wagered four artificial intelligence (AI) experts that no computer program would defeat him at the game in the subsequent decade. Levy won his bet. The sum was modest, 1,250 British pounds, but the AI community was chastened. They had overreached in their predictions for their young science. It would be another decade before Levy was bested in chess by a computer program, Deep Thought, an early version of the program that beat Garry Kasparov, the reigning chess champion, in the 1990s.

These days, Levy is the chief executive officer at a company that develops "smart" toys for children. In 2009, Levy and his team won-and this for the second time-the prestigious Loebner Prize, widely regarded as the world championship for conversational software. In this contest, Levy's "chat bot" program was best at convincing people that they were talking to another person and not to a machine. Always impressed with Levy's inventiveness, I found myself underwhelmed by the message of this latest book, Love and Sex with Robots.

No tongue-in-cheek science fiction fantasy, it was reviewed without irony in the New York Times by a reporter who had just spent two weeks at the Massachusetts Institute of Technology (MIT) and wrote glowingly about its robotics culture as creating "new forms of life."

Love and Sex is earnest in its predictions about where people and robots will find themselves by mid-century: "Love with robots will be as normal as love with other humans, while the number of sexual acts and lovemaking positions commonly practiced between humans will be extended, as robots will teach more than is in all of the world's published sex manuals combined."

Levy argues that robots will teach us to be better friends and lovers because we will be able to practice on them. Beyond this, they will substitute where people fail. Levy proposes, among other things, the virtues of marriage to robots. He argues that robots are, of course, "other" but, in many ways, better. No cheating. No heartbreak. In Levy's argument, there is one simple criterion for judging the worth of robots in even the most intimate domains: Does being with a robot make you feel better? The master of today's computerspeak judges future robots by the impact of their behavior. And his next bet is that in a very few years, this is all we will care about as well.

I am a psychoanalytically trained psychologist. Both by temperament and profession, I place high value on relationships of intimacy and authenticity. Granting that an AI might develop its own origami of lovemaking positions, I am troubled by the idea of seeking intimacy with a machine that has no feelings, can have no feelings, and is really just a clever collection of "as if" performances, behaving as if it cared, as if it understood us. Authenticity, for me, follows from the ability to put oneself in the place of another, to relate to the other because of a shared store of human experiences: we are born, have families, and know loss and the reality of death.

A robot, however sophisticated, is patently out of this loop. So, I turned the pages of Levy's book with a cool eye. What if a robot is not a "form of life" but a kind of performance art? What if "relating" to robots makes us feel "good" or "better" simply because we feel more in control? Feeling good is no golden rule. One can feel good for bad reasons. What if a robot companion makes us feel good but leaves us somehow diminished? The virtue of Levy's bold position is that it forces reflection: What kinds of relationships with machines are possible, desirable, or ethical? What does it mean to love a robot? As I read Love and Sex, my feelings on these matters were clear. A love relationship involves coming to savor the surprises and the rough patches of looking at the world from another's point of view, shaped by history, biology, trauma, and joy. Computers and robots do not have these experiences to share. We look at mass media and worry about our culture being intellectually "dumbed down." Love and Sex seems to celebrate an emotional dumbing down, a willful turning away from the complexities of human partnerships-the inauthentic as a new aesthetic.

I was further discomforted as I read Love and Sex because Levy had interpreted my findings about the "holding power" of computers to argue his case. Indeed, Levy dedicated his book to Anthony, an MIT computer hacker I interviewed in the early 1980s. Anthony was nineteen when I met him, a shy young man who found computers reassuring. He felt insecure in the world of people with its emotional risks and shades of gray. The activity and interactivity of computer programming gave Anthony-lonely, yet afraid of intimacy-the feeling that he was not alone.

In Love and Sex, Levy idealizes Anthony's accommodation and suggests that loving a robot would be a reasonable next step for people like him. I was sent an advance copy of the book, and Levy asked if I could get a copy to Anthony, thinking he would be flattered. I was less sure. I didn't remember Anthony as being at peace with his retreat to what he called "the machine world." I remembered him as wistful, feeling himself a spectator of the human world, like a kid with his nose to the window of a candy store. When we imagine robots as our future companions, we all put our noses to that same window.

I was deep in the irony of my unhappy Anthony as a role model for intimacy with robots when the Scientific American reporter called. I was not shy about my lack of enthusiasm for Levy's ideas and suggested that the very fact we were discussing marriage to robots at all was a comment on human disappointments-that in matters of love and sex, we must be failing each other. I did not see marriage to a machine as a welcome evolution in human relationships.

And so I was taken aback when the reporter suggested that I was no better than bigots who deny gays and lesbians the right to marry. I tried to explain that just because I didn't think people should marry machines didn't mean that any mix of adult people wasn't fair territory. He accused me of species chauvinism: Wasn't I withholding from robots their right to "realness"? Why was I presuming that a relationship with a robot lacked authenticity? For me, the story of computers and the evocation of life had come to a new place. At that point, I told the reporter that I, too, was taking notes on our conversation. The reporter's point of view was now data for my own work on our shifting cultural expectations of technology-data, that is, for the book you are reading. His analogizing of robots to gay men and women demonstrated that, for him, future intimacy with machines would not be a second-best substitute for finding a person to love. More than this, the reporter was insisting that machines would bring their own special qualities to an intimate partnership that needed to be honored in its own right. In his eyes, the love, sex, and marriage robot was not merely "better than nothing," a substitute. Rather, a robot had become "better than something." The machine could be preferable-for any number of reasons-to what we currently experience in the messy, often frustrating, and always complex world of people.

This episode with the Scientific American reporter shook me-perhaps in part because the magazine had been for me, since childhood, a gold standard in scientific publication. But the extravagance of the reporter's hopes for robots fell into a pattern I had been observing for nearly a decade. The encounter over Love and Sex most reminded me of another time, two years before, when I met a female graduate student at a large psychology conference in New Orleans; she had taken me aside to ask about the current state of research on robots designed to serve as human companions. At the conference, I had given a presentation on anthropomorphism-on how we see robots as close to human if they do such things as make eye contact, track our motion, and gesture in a show of friendship. These appear to be "Darwinian buttons" that cause people to imagine that the robot is an "other," that there is, colloquially speaking, "somebody home."

During a session break, the graduate student, Anne, a lovely, raven-haired woman in her mid-twenties, wanted specifics. She confided that she would trade in her boyfriend "for a sophisticated Japanese robot" if the robot would produce what she called "caring behavior." She told me that she relied on a "feeling of civility in the house." She did not want to be alone. She said, "If the robot could provide the environment, I would be happy to help produce the illusion that there is somebody really with me." She was looking for a "no-risk relationship" that would stave off loneliness. A responsive robot, even one just exhibiting scripted behavior, seemed better to her than a demanding boyfriend. I asked her, gently, if she was joking. She told me she was not.

An even more poignant encounter was with Miriam, a seventy-two-year-old woman living in a suburban Boston nursing home, a participant in one of my studies of robots and the elderly.

I meet Miriam in an office that has been set aside for my interviews. She is a slight figure in a teal blue silk blouse and slim black pants, her long gray hair parted down the middle and tied behind her head in a low bun. Although elegant and composed, she is sad. In part, this is because of her circumstances. For someone who was once among Boston's best-known interior designers, the nursing home is a stark and lonely place. But there is also something immediate: Miriam's son has recently broken off his relationship with her. He has a job and family on the West Coast, and when he visits, he and his mother quarrel-he feels she wants more from him than he can give.

Now Miriam sits quietly, stroking Paro, a sociable robot in the shape of a baby harp seal. Paro, developed in Japan, has been advertised as the first "therapeutic robot" for its ostensibly positive effects on the ill, elderly, and emotionally troubled. Paro can make eye contact by sensing the direction of a human voice, is sensitive to touch, and has a small working English vocabulary for "understanding" its users (the robot's Japanese vocabulary is larger); most importantly, it has "states of mind" affected by how it is treated. For example, it can sense whether it is being stroked gently or with aggression. Now, with Paro, Miriam is lost in her reverie, patting down the robot's soft fur with care. On this day, she is particularly depressed and believes that the robot is depressed as well. She turns to Paro, strokes him again, and says, "Yes, you're sad, aren't you? It's tough out there. Yes, it's hard." Miriam's tender touch triggers a warm response in Paro: it turns its head toward her and purrs approvingly. Encouraged, Miriam shows yet more affection for the little robot. In attempting to provide the comfort she believes it needs, she comforts herself.

Because of my training as a clinician, I believe that this kind of moment, if it happens between people, has profound therapeutic potential. We can heal ourselves by giving others what we most need. But what are we to make of this transaction between a depressed woman and a robot? When I talk to colleagues and friends about such encounters-for Miriam's story is not unusual-their first associations are usually to their pets and the solace they provide. I hear stories of how pets "know" when their owners are unhappy and need comfort. The comparison with pets sharpens the question of what it means to have a relationship with a robot. I do not know whether a pet could sense Miriam's unhappiness, her feelings of loss. I do know that in the moment of apparent connection between Miriam and her Paro, a moment that comforted her, the robot understood nothing. Miriam experienced an intimacy with another, but she was in fact alone. Her son had left her, and as she looked to the robot, I felt that we had abandoned her as well.

Experiences such as these-with the idea of aliveness on a "need-to-know" basis, with the proposal and defense of marriage to robots, with a young woman dreaming of a robot lover, and with Miriam and her Paro-have caused me to think of our time as the "robotic moment." This does not mean that companionate robots are common among us; it refers to our state of emotional-and I would say philosophical-readiness. I find people willing to seriously consider robots not only as pets but as potential friends, confidants, and even romantic partners.

We don't seem to care what these artificial intelligences "know" or "understand" of the human moments we might "share" with them. At the robotic moment, the performance of connection seems connection enough. We are poised to attach to the inanimate without prejudice. The phrase "technological promiscuity" comes to mind. As I listen for what stands behind this moment, I hear a certain fatigue with the difficulties of life with people. We insert robots into every narrative of human frailty. People make too many demands; robot demands would be of a more manageable sort. People disappoint; robots will not.

When people talk about relationships with robots, they talk about cheating husbands, wives who fake orgasms, and children who take drugs. They talk about how hard it is to understand family and friends. I am at first surprised by these comments. Their clear intent is to bring people down a notch. A forty-four-year-old woman says, "After all, we never know how another person really feels. People put on a good face. Robots would be safer." A thirty-year-old man remarks, "I'd rather talk to a robot. Friends can be exhausting. The robot will always be there for me. And whenever I'm done, I can walk away."

The idea of sociable robots suggests that we might navigate intimacy by skirting it. People seem comforted by the belief that if we alienate or fail each other, robots will be there, programmed to provide simulations of love.


Our population is aging; there will be robots to take care of us. Our children are neglected; robots will tend to them. We are too exhausted to deal with each other in adversity; robots will have the energy. Robots won't be judgmental. We will be accommodated. An older woman says of her robot dog, "It is better than a real dog. . . . It won't do dangerous things, and it won't betray you. . . . Also, it won't die suddenly and abandon you and make you very sad."

The elderly are the first to have companionate robots aggressively marketed to them, but young people also see the merits of robotic companionship. These days, teenagers have sexual adulthood thrust upon them before they are ready to deal with the complexities of relationships. They are drawn to the comfort of connection without the demands of intimacy. This may lead them to a hookup-sex without commitment or even caring. Or it may lead to an online romance-companionship that can always be interrupted. Not surprisingly, teenagers are drawn to love stories in which full intimacy cannot occur-here I think of current passions for films and novels about high school vampires who cannot sexually consummate relationships for fear of hurting those they love. And teenagers are drawn to the idea of technological communion.

They talk easily of robots that would be safe and predictable companions.

These young people have grown up with sociable robot pets, the companions of their playrooms, which portrayed emotion, said they cared, and asked to be cared for.

We are psychologically programmed not only to nurture what we love but to love what we nurture. So even simple artificial creatures can provoke heartfelt attachment. Many teenagers anticipate that the robot toys of their childhood will give way to full-fledged machine companions. In the psychoanalytic tradition, a symptom addresses a conflict but distracts us from understanding or resolving it; a dream expresses a wish.

Sociable robots serve as both symptom and dream: as a symptom, they promise a way to sidestep conflicts about intimacy; as a dream, they express a wish for relationships with limits, a way to be both together and alone.

Some people even talk about robots as providing respite from feeling overwhelmed by technology. In Japan, companionate robots are specifically marketed as a way to seduce people out of cyberspace; robots plant a new flag in the physical real. If the problem is that too much technology has made us busy and anxious, the solution will be another technology that will organize, amuse, and relax us. So, although historically robots provoked anxieties about technology out of control, these days they are more likely to represent the reassuring idea that in a world of problems, science will offer solutions.

Robots have become a twenty-first-century deus ex machina. Putting hope in robots expresses an enduring technological optimism, a belief that as other things go wrong, science will go right. In a complicated world, robots seem a simple salvation. It is like calling in the cavalry.

But this is not a book about robots. Rather, it is about how we are changed as technology offers us substitutes for connecting with each other face-to-face. We are offered robots and a whole world of machine-mediated relationships on networked devices. As we instant-message, e-mail, text, and Twitter, technology redraws the boundaries between intimacy and solitude. We talk of getting "rid" of our e-mails, as though these notes are so much excess baggage. Teenagers avoid making telephone calls, fearful that they "reveal too much."

They would rather text than talk. Adults, too, choose keyboards over the human voice. It is more efficient, they say. Things that happen in "real time" take too much time. Tethered to technology, we are shaken when that world "unplugged" does not signify, does not satisfy. After an evening of avatar-to-avatar talk in a networked game, we feel, at one moment, in possession of a full social life and, in the next, curiously isolated, in tenuous complicity with strangers. We build a following on Facebook or MySpace and wonder to what degree our followers are friends. We recreate ourselves as online personae and give ourselves new bodies, homes, jobs, and romances. Yet, suddenly, in the half-light of virtual community, we may feel utterly alone. As we distribute ourselves, we may abandon ourselves. Sometimes people experience no sense of having communicated after hours of connection. And they report feelings of closeness when they are paying little attention. In all of this, there is a nagging question: Does virtual intimacy degrade our experience of the other kind and, indeed, of all encounters, of any kind?

The blurring of intimacy and solitude may reach its starkest expression when a robot is proposed as a romantic partner. But for most people it begins when one creates a profile on a social-networking site or builds a persona or avatar for a game or virtual world.

Over time, such performances of identity may feel like identity itself. And this is where robotics and the networked life first intersect. For the performance of caring is all that robots, no matter how sociable, know how to do. I was enthusiastic about online worlds as "identity workshops" when they first appeared, and all of their possibilities remain.

Creating an avatar-perhaps of a different age, a different gender, a different temperament-is a way to explore the self. But if you're spending three, four, or five hours a day in an online game or virtual world (a time commitment that is not unusual), there's got to be someplace you're not. And that someplace you're not is often with your family and friends-sitting around, playing Scrabble face-to-face, taking a walk, watching a movie together in the old-fashioned way. And with performance can come disorientation. You might have begun your online life in a spirit of compensation. If you were lonely and isolated, it seemed better than nothing. But online, you're slim, rich, and buffed up, and you feel you have more opportunities than in the real world. So, here, too, better than nothing can become better than something-or better than anything. Not surprisingly, people report feeling let down when they move from the virtual to the real world. It is not uncommon to see people fidget with their smartphones, looking for virtual places where they might once again be more.

Sociable robots and online life both suggest the possibility of relationships the way we want them. Just as we can program a made-to-measure robot, we can reinvent ourselves as comely avatars. We can write the Facebook profile that pleases us. We can edit our messages until they project the self we want to be. And we can keep things short and sweet. Our new media are well suited for accomplishing the rudimentary. And because this is what technology serves up, we reduce our expectations of each other. An impatient high school senior says, "If you really need to reach me, just shoot me a text." He sounds just like my colleagues on a consulting job, who tell me they would prefer to communicate with "real-time texts."

Our first embrace of sociable robotics (both the idea of it and its first exemplars) is a window onto what we want from technology and what we are willing to do to accommodate it. From the perspective of our robotic dreams, networked life takes on a new cast. We imagine it as expansive. But we are just as fond of its constraints. We celebrate its "weak ties," the bonds of acquaintance with people we may never meet. But that does not mean we prosper in them. We often find ourselves standing depleted in the hype. When people talk about the pleasures of these weak-tie relationships as "friction free," they are usually referring to the kind of relationships you can have without leaving your desk.

Technology ties us up as it promises to free us up. Connectivity technologies once promised to give us more time. But as the cell phone and smartphone eroded the boundaries between work and leisure, all the time in the world was not enough. Even when we are not "at work," we experience ourselves as "on call"; pressed, we want to edit out complexity and "cut to the chase."
