The Making of 31 Days

Super 8 centred in an HD frame

I placed the 31 seconds of 4:3 edited footage in the centre of an HD frame, preserving image quality by keeping its pixels at a 1:1 ratio. The image seemed a little ‘lost’ in the HD frame, however, and would likely appear smaller than other work in the compilation.

Arrangement of panes across the HD canvas.

In search of a solution that would use more screen-space, I then composited another two frames across the 16:9 HD screen. As seen in Figure 21, the outer two SD frames extend beyond the edges of the HD frame, shown as a black rectangle.

One Second a Day compositing with one-second offset on each pane.

I chose this arrangement in preference to three smaller complete panes. Offsetting the footage in each pane by one second, from right to left, displayed subsequent clips across the three panes. The three video tracks in Figure 22 above are the right, centre and left panes, reading from top to bottom. When playing, each one-second clip appears partially in the right-hand pane, then fully visible in the centre pane, then partially visible in the left-hand pane, alluding to past, present and future across the composition, reading left to right.
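The travel of each clip from right to left can be sketched as a hypothetical mapping (not drawn from the actual edit project): at any second of playback, the centre pane holds the current clip, the right-hand pane previews the next one and the left-hand pane retains the previous one.

```python
# Hypothetical sketch of the one-second offset across the three panes.
# Clip indices are the one-second selections in playback order.

def panes_at(second: int) -> dict:
    """Which clip index is visible in each pane at a given second."""
    return {
        "left": second - 1,    # the clip just seen in the centre ('past')
        "centre": second,      # the current clip ('present')
        "right": second + 1,   # the clip about to move to the centre ('future')
    }

# During second 5 of playback, clip 4 lingers on the left, clip 5 is
# central and clip 6 is entering on the right.
print(panes_at(5))  # {'left': 4, 'centre': 5, 'right': 6}
```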

One Second a Day film still – three Super 8 frames across an HD screen

The footage used in One Second a Day is silent. To create a soundtrack, I recorded a Super 8 film projector being started and running, syncing this audio to the moving images and making an auditory connection to the performance of film projection. The machine used to digitise the footage was almost silent, its operation not disturbing the hushed, reverential space of the telecine suite in Soho. The sound of the Chinon projector being operated in my work room brings the footage home – literally – as the progress of a 50ft reel through the machine is sonically coloured by that space. Brandon LaBelle describes his act of clapping in a room as “more than a single sound and its source, but rather a spatial event” that is “inflected not only by the materiality of space but also by the presence of others, by a body” (LaBelle, 2015, p.xii). Recording the audio in a domestic-sized room, with me standing and holding the microphone, shapes the sound and, to me, it speaks of the times when I first saw each reel freshly returned from processing.

Super 8 cameras usually have automatic exposure, so filming can be freewheeling in execution: frame, focus and press the trigger. The cost of the film and processing encourages frugality, so individual shots are fairly brief. Despite the consequent brevity of most of the filming – meaning more frequent in-camera edits – 30 of the one-second selections contain a single shot. Close inspection of One Second a Day reveals that a single one-second clip has an on-screen edit within it that then repeats across the three panes, shown in Figure 24. 

An existing edit in a one second clip

The short film was compiled with the other students’ videos to form a collection with a running time of approximately eight minutes, entitled 3D3: 1 Second a Day Showreel. The compilation is displayed by 3D3 on its website.

My Super 8 films had undergone a complicated series of remediations on their passage to inclusion in the group project. The film frames were mostly shot at 18 frames per second, digitised to standard definition video at 50 interlaced fields per second, colour-corrected by a telecine operator and recorded to Sony DigiBeta half-inch digital tape. This tape was played out and captured to hard disk, then de-interlaced and placed in a 25 progressive frames-per-second high-definition video editing project. The SD telecine and video edit used ProRes codec QuickTime files with 4:2:2 colour subsampling. When the edit was complete, the subsampling was downgraded to 4:2:0 and the video compressed using the H.264 codec for playback on an LCD television.
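The cost of that final downgrade can be put in rough numbers. In the J:a:b notation, 4:2:2 keeps half the chroma samples horizontally, while 4:2:0 halves them vertically as well; a minimal sketch of the stored data per pixel (8-bit, before compression):

```python
# Rough arithmetic for chroma subsampling schemes (8-bit, uncompressed).
# J:a:b describes samples in a block J pixels wide and 2 rows high:
# a = chroma samples per row 1, b = chroma samples per row 2.

def bytes_per_pixel(a: int, b: int, j: int = 4) -> float:
    """Average stored bytes per pixel for a J:a:b scheme at 8 bits."""
    luma = 2 * j            # one luma sample per pixel in the J x 2 block
    chroma = 2 * (a + b)    # two chroma channels (Cb and Cr)
    return (luma + chroma) / (2 * j)

print(bytes_per_pixel(2, 2))  # 4:2:2 -> 2.0 bytes per pixel
print(bytes_per_pixel(2, 0))  # 4:2:0 -> 1.5 bytes per pixel
```

So the delivery file carried a quarter less raw image data per pixel than the edit master, even before H.264 compression.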

(Re)Remediation

Kodachrome Super 8 frame

Sea Front serves as a case study for the remediation choices that were made for the Super 8 practice experiments that follow. The first digitisation was a home telecine using a standard definition miniDV digital video camera to film the projected image. As described earlier, I set the projector to 18 fps and slowed it slightly to reduce flicker in the captured video. The editing was carried out in its native 50i interlaced format. The finished film won an award at the London Short Film Festival, despite, or perhaps because of, the ‘unprofessional’ low resolution rendition of the Super 8 footage which pulses – an effect caused by the frequency of the rotating projector blades.

Digitised Super 8 frames

A screen grab from this home transfer is seen on the left in Figure 19. The professional telecine conducted by Deluxe in London, described earlier, is in the centre. Although this was standard definition (SD), it was more detailed and did not exhibit the previous iteration’s pulsing. I recreated the video edit of Sea Front using the Deluxe scan, which was carried out at 25 fps. During editing I slowed the footage to preserve the ‘feel’ of the first award-winning edit. Technology has developed rapidly, making high-quality scanning much more affordable, but it would still be expensive and time-consuming to re-digitise all my archive at HD or higher resolution. I did take a selection of Super 8 material to Bristol in 2019 for an HD scan, in which each film frame is captured to a progressive video file. The Super 8 film passed in front of a lens on the scanner, rather than across the physical gate of a projector or telecine machine. This process is gentle on the physical film and allows the whole width of the film to be scanned, as shown in Figure 18, or zoomed in to the picture area, as in the right-hand image in Figure 19.

Womad (2016) was edited using the Deluxe SD telecine footage and was screened on Sony Cube monitors. These obsolete 4:3 ratio CRT televisions displayed the SD material at its best. The difficulties of screening SD interlaced video on an HD flatscreen, and the strategies to cope with the mismatch are discussed in Skimming the Archive on pages 93-98.

31 Days vs The News

Multi-screen editing

Non-linear video editing systems (NLEs) typically offer multiple ways to view material, with options to display the editing interface across several screens. Figure 2 shows Final Cut Pro (FCP) in action. The main iMac Pro screen at the top left displays panes containing material selected for the next edit, with the timeline and its thumbnail images below; the clip under the playhead in the timeline is displayed above and on a separate monitor. The iPad next to the keyboard displays ‘filmstrips’ with thumbnail representations of material available for editing.

Other NLEs, such as DaVinci Resolve, offer even more flexibility, allowing editors to arrange the interface across many screens to suit their needs. Along with the image content, FCP can display metadata such as timecode and clip names, along with technical data such as video scopes to indicate the brightness and colour of the video being edited. This hypermediacy delivers simultaneous streams of information useful to the editor, such as seeing the approach of the next video edit on the timeline before the cut is displayed.

When ‘immediacy’ is required, FCP can be switched to ‘full screen’, so the interface disappears, leaving only the video content playing across the whole monitor. When reviewing an edited sequence in full screen there is still the temptation to pause the playback and return to the editing interface to address any points of interest or concern. The editor can leave the edit suite and watch the programme played out to a television to access the immediacy that will be experienced by the eventual audience.

Live television news broadcasting is an everyday example of hypermediacy that viewers may access via several screen-based devices, such as mobile phones, tablets, laptops, computer monitors and televisions, used in a range of settings, from the domestic to the workplace and outdoor public spaces. Within the frame of the news programme, several ‘features’ vie for the viewer’s attention: the studio-based lead journalist or anchor; live or recorded footage from the location of the story, which may include a reporter, interviewees and other filming; a live discussion with guest commentators who may be present in the studio or elsewhere; a window in which a miniaturised sign language interpreter translates what is said for D/deaf people; subtitling for people who are hearing-impaired and others watching on a muted television; various text-based news crawls that slide across the bottom of the screen; and the logo of the television company. National television weather forecasts sometimes present on-screen QR codes that, once scanned, allow viewers to access more localised meteorological information on a mobile phone.

Personal vs Institutional

Some personal archives may find their way into institutional archives, through donation by the family, and so may come to form part of our social memory – but how ‘safe’ is this material? Many institutional film archives, such as the British Film Institute archive, have digitised parts of their collections to make the material more accessible to the public and for educational and research purposes – and have preserved the original celluloid material. However, other archives may hold only digital copies of the film material. Will the ‘thin images’ of incidental moments of day-to-day lived experience be weeded out from the public archive?

Laura U Marks’ comment seems relevant here: “The less important the film or tape (and by extension, its potential audience) was considered, the less likely that it will have been archived with care” (Marks, 2002, p. 169) – or, I suggest, the more likely it is to be considered insignificant to the construction of our shared, socio-cultural history by a dutiful archivist, who may remove the material.

The Death of Film

The early 21st century has seen the ‘death of film’, as commercial feature film production and projection moved wholesale from celluloid to digital, following the transition in broadcasting and other moving image production away from film, via analogue video, to a wholly digital workflow. Cinemas across the UK transitioned from 35mm projection to the Digital Cinema Initiatives (DCI) technical architecture proposed in 2002, projecting films from data supplied in Digital Cinema Package (DCP) format. The three major manufacturers, Panavision, ARRI and Aaton, quietly ceased production of their film cameras. Bill Russell of ARRI explained to journalist and industry commentator Debra Kaufman:

The demand for film cameras on a global basis has all but disappeared. There are still some markets–not in the U.S.–where film cameras are still sold, but those numbers are far fewer than they used to be. If you talk to the people in camera rentals, the amount of film camera utilization in the overall schedule is probably between 30 to 40 percent. (Kaufman, 2011)

ARRI wholly transitioned into manufacturing a hugely successful range of digital cinematography cameras, having only produced film cameras to special order from 2009. The founder of the French manufacturer Aaton, Jean-Pierre Beauviala, added “Almost nobody is buying new film cameras. Why buy a new one when there are so many used cameras around the world?” (Ibid.) Beauviala’s comment acknowledges that there were still users of professional film cameras who could purchase used examples, and this remains true today. Remarkably, commercial analogue movie production is still viable in the third decade of the 21st century, with the necessary resources: stock, cameras, processing and so on.

At the same time, there has been a grassroots revival in celluloid filmmaking, with practitioners using the technology for its inherent qualities and associations. British filmmakers Ben Rivers and Mark Jenkin have had success with 16mm production using hand-processing of their film stock, the latter winning a 2020 BAFTA for Outstanding Debut by a British director for Bait (Jenkin, 2019), a feature-length film shot on black and white stock using a vintage hand-wound Bolex SB 16mm camera. Noted British visual artist Tacita Dean continues to work with film and campaigns for its survival, helping to steer SAVEFILM.ORG.

Bill Douglas

In her article, ‘Memory Texts and Memory Work: Performances of Memory in and with Visual Media’, Annette Kuhn explores whether the Scottish filmmaker Bill Douglas’s Trilogy, made between 1972 and 1978, is “autobiographical” (Kuhn, 2010, p.3). She suggests that it is difficult to sustain a “first person voice” in cinema because the ‘I’ position “of literary autobiography” – in which the narrator, author and protagonist are the same person – does not translate to cinema (Ibid., pp.3, 5). This interests me because Kuhn goes on to describe the films of Douglas’s Trilogy as being as much “autoethnographic” as they are “autobiographical”, because they reflect “from the inside” the filmmaker’s impoverished and traumatic childhood. She considers that these films “speak from a place of otherness that sits well with the impossibility of cinema’s point of enunciation being ‘pinned down with any certainty’”, and identifies “personal experimental cinema” as exploiting this with its specificities of fragmentary montage and “non-linear temporality” (Ibid., p.5). From Kuhn’s discussion, I perceive that my filmmaking can be seen to belong within ‘personal experimental cinema’ – or ‘personal cinema’ – with Finborough Road, Returning to Dungeness and Father-land amplifying the ‘personal’ aspect through their use of the ‘first person voice’, and, further, that Father-land could be viewed as an autoethnographic work.

Kuhn, A. (2010) Memory Texts and Memory Work: Performances of Memory in and with Visual Media. Memory Studies, XX, 

The Continued Use of Celluloid Film

Discussion of analogue, film-based filmmaking is a growing discourse in academia. In 2021, Aberystwyth University held a well-attended five-day conference, The Shifting Ecologies of Photochemical Film in the Digital Era. The promotional material stated:

Since the early 2000s, the medium of film has moved from a dominant to a marginal position as digital technology has rendered photochemical and mechanical technology obsolete on a commercial scale. Despite this, artists are finding new ways to work with film, forging alternative communities, pursuing artisanal methods, repurposing abandoned equipment, reinventing primitive techniques and exploring new hybrid technologies. (Aberystwyth University, 2021)

The same institution has developed a film workshop where students can access film equipment and integrated its use into academic programmes.

Beyond academia, there is a vibrant community of experimenters and instructors supporting the use of film by artists. Dr Joanna Mayes in St Ives teaches techniques for processing film using developers made from plants foraged from the sites where the film was exposed. James Holcombe provides technical support to filmmakers and organisations from his garden laboratory in Frome, Somerset. Holcombe recently became the UK agent for ORWO, an East German film manufacturer that survived the economic shock of unification with West Germany. ORWO produces black and white film and has stated that it will soon produce colour 16mm stock.

Karel Doing explains in a journal article how he took his research into direct animation of plant material on unexposed film-stock – dubbed phytography – from his PhD out to a community spread around the world by running workshops:

I have been extending my phytography practice by organizing workshops, which have been affected by local flora, climate and participants. During these workshops, technique and context are discussed, experiments are set up, images are produced and, finally, results are projected, sometimes with multiple projectors. (Doing, 2020)

Social media has provided a means for ‘analogue filmmakers’ to share knowledge and promote photochemical practices. Facebook groups exist for various niche interests: Aaton Anonymous holds a wealth of knowledge about that camera marque, while members of Bolex Camera Users help each other use their clockwork cameras. Instagram has numerous posts labelled with film-specific tags, such as #16mm – currently appended to 357,940 posts – and #super8 – currently appended to 312,634 posts.

https://www.instagram.com/explore/tags/16mm/ on 26/06/23
https://www.instagram.com/explore/tags/Super8/ on 6/06/23

Filming on Film

The Passage of Time

There are complications, or at least considerations, regarding the frame rates at which Super 8 footage is filmed and then projected or transferred by telecine. Super 8 cameras typically run at 18 frames per second (fps) for silent filming and 24 fps for synchronised sound or to conform to the professional motion picture standard. Silent frame rates have survived from the early days of cinema, when 16 or 18 fps sufficed to create a naturalistic rendition of motion while using the minimum of film stock for acquisition, editing and, most importantly, distribution and projection. The frame rate was standardised upwards to 24 fps for ‘talkies’ in the 1930s to reproduce the soundtrack with greater fidelity. 24 fps remains the benchmark for narrative filmmaking and projection to the present day, despite the production process now typically being completely digital.

The Super 8 filmmaker wanting to reproduce naturalistic motion has a choice of setting the camera either to 18 or 24 fps. Either filming speed gives a satisfactory experience for the viewer. Consequently, I ran most of the footage in my archive through the camera at the slower – and more frugal – frame rate, allowing for an extra 50 seconds of filming on each roll.

Home Transfer

The method I often used to transfer film to video was to project the film on to a white wall and re-film it on video in a darkened room, a common ‘cut price’ approach to telecine. The results were often technically flawed but rendered the footage in a style quite different to the sterility of video. My interest in still photography predates my work as a filmmaker. Perhaps this is why the first viewing of the slowed-down footage of Derek Jarman’s Gerald’s Film (Jarman, 1976) resonated so strongly. Its dreamlike quality shimmered in a space between still and moving image, the unstoppable present of cinema subverted by the photographic readability of successive frames. Jarman’s early Super 8 work is a personal cinema – a documenting of his life – where this technique of filming at 6 fps then projecting at 3 fps to manipulate time at once distances the viewer from a naturalistic rendition of the profilmic space and, paradoxically, offers a closer, more personal and human experience.

Professional Telecine

In 2011, the Super 8 rushes that constitute my digital archive were professionally transferred to DigiBeta on a telecine machine at Deluxe Soho in London at 25 fps. This ‘projection’ rate has the benefit of matching each frame of film to a frame of video but accelerates movement to a degree – slightly for 24 fps material but noticeably for footage filmed at 18 fps. In 1987, the prospect of having my Super 8 footage telecined in Soho over 20 years later – and consequently setting the camera at 24 fps to pre-empt the issues of filming at 18 fps – would have seemed fanciful. These considerations have been rendered less important by advances in non-linear editing, which allow footage to be slowed down to match its ‘natural’ cadence on the timeline. Advancing technology has also allowed footage to be digitised at much higher quality on cheaper machines and thereby given ‘new life’, in contrast to contemporary productions shot on video, whose quality was fixed at standard definition.
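The degree of acceleration is simply the ratio of the playback rate to the filming rate; a minimal sketch of the arithmetic:

```python
# Apparent speed change when film shot at one rate is played,
# telecined or retimed at another: the ratio of the two rates.

def speed_factor(shot_fps: float, played_fps: float) -> float:
    """Factor by which on-screen motion is faster (>1) or slower (<1)."""
    return played_fps / shot_fps

# 18 fps footage telecined frame-for-frame at 25 fps runs ~39% fast:
print(round(speed_factor(18, 25), 2))  # 1.39

# Retiming the 25 fps capture to 72% (18/25) in the NLE restores the
# original cadence.
print(18 / 25)  # 0.72
```

The same ratio shows why 24 fps material suffers far less at a 25 fps telecine: it is accelerated by only around 4%.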

Material Particulars

Reversal film, such as Kodachrome 40, is exposed in the camera before being processed to produce a strip of positive pictures. These are not, in the technical sense, negative images, in which light areas of the scene are rendered as dark areas in the film emulsion and colours are reversed. Essentially, a long strip is produced of what are commonly described as ‘colour slides’ or ‘colour transparencies’ in the world of stills photography. Each frame on the 50-foot reel of Super 8 film is a tiny photographic image of approximately 4mm by 5.8mm – small but readable by the naked eye, particularly if the images are of high contrast, such as trees against the sky. To get a better view of individual frames, a photographic loupe is required, along with some form of film transport, such as an editor/viewer or projector, if the film is to be experienced with the illusion of movement.

Celluloid photography and filmmaking have enjoyed a modest, artisanal renaissance in recent years. For some, its appeal might be as a craft practice with pleasure derived from choosing and buying film material, loading the camera in semi-darkness, deploying vintage exposure meters and other arcana. A cynic might view this ‘return to film’ as nostalgia, the re-enactment of an imagined analogue past. Others, such as the American filmmaker, Robert Schaller, have a long-standing commitment to analogue practice.

He also designs and builds 16mm cameras, creates his own emulsion and produces projection prints. Unlike Schaller, most ‘analogue filmmakers’ use digital editing, although a few die-hards still cut their negs and have a projection print struck with optical sound embedded or a mag stripe coated on. The Super 8 reversal material in my archive was processed commercially. Kodachrome required highly complex development by the laboratory, and even the E6 processing of the Agfa and Ektachrome stocks was a complicated procedure.

The photographic rendition of the Kodachrome emulsion is commonly held to connote ‘the past’ and has a widespread cultural resonance. Paul Simon sings in ‘Kodachrome’ (1973):

Kodachrome
They give us those nice bright colors
They give us the greens of summers
Makes you think all the world’s a sunny day
I got a Nikon camera
I love to take a photograph
So mama don’t take my Kodachrome away

For generations of people around the world, their lives were documented on Kodachrome, both as colour slides and Super 8 ciné. Eastman Kodak ceased manufacturing Kodachrome 40 in 2006 (Kodak, 2022). In the 30-minute documentary, The End of an Era, a National Geographic film crew follows the photographer Steve McCurry as he shoots “the very last roll of Kodachrome to come off the assembly line” and processes the film at Dwayne’s Photo – McCurry had worked extensively for National Geographic as a freelancer and has “over 800 thousand” Kodachrome transparencies in his archive (Russo and Weise, 2013). In the 2017 feature film Kodachrome (Raso, 2017), an A&R music executive accompanies his estranged and dying father, a well-known photojournalist, on a road trip from New York to Parsons, Kansas, to process his last four rolls of Kodachrome film before the sole remaining film processing facility, Dwayne’s Photo, closes. The film navigates a world changing from analogue to digital as the protagonists try to put the past behind them. People associate the ‘look’ of Kodachrome stock – its aesthetic – with the past and their memories of holidays, everyday activities, and family occasions. We see the association of Kodachrome with the past in Wim Wenders’ Paris Texas (Wenders, 1984) in which the protagonist – Travis Henderson, played by Harry Dean Stanton – is reintroduced to his past by the projection of Kodachrome home movies. My Super 8 footage had these material connotations of being of the past when it was first viewed, more now that it has aged.

The 50-foot (15m) Super 8 cartridge will film for 3 minutes 20 seconds at 18 fps but only 2 minutes 30 seconds at 24 fps.
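These running times follow from the standard figure of 72 Super 8 frames per foot of film; a minimal sketch of the arithmetic:

```python
# Running time of a Super 8 cartridge, assuming the standard figure
# of 72 frames per foot for the Super 8 format.

FRAMES_PER_FOOT = 72

def running_time_seconds(feet: float, fps: float) -> float:
    """Seconds of screen time on a reel shot at the given frame rate."""
    return feet * FRAMES_PER_FOOT / fps

print(running_time_seconds(50, 18))  # 200.0 s, i.e. 3 min 20 s
print(running_time_seconds(50, 24))  # 150.0 s, i.e. 2 min 30 s
# Filming at 18 fps thus gains an extra 50 seconds on each roll.
```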

I shot the footage used in Sea Front (Moore, 2010) at 24 fps – unusual for my Super 8 work. Initially, the transfer was at the matching 24 fps, but everything seemed to move too quickly on screen. The second transfer to digital was at around 16 2/3 fps, achieved by setting the projector at 18 fps and using its speed adjuster to lower the frame rate. The slower, more relaxed interpretation seemed to be more natural, more fitting, closer to the 

Zapruder’s Frames

An example of 8mm film being examined in minute detail is Abraham Zapruder’s filming of John F. Kennedy’s assassination in 1963. Zapruder’s film, with its shot of the motorcade inscribed onto the tiny Standard 8 frames, is an unbroken stream of vérité. Standard 8 camera film is 16mm wide, so when it was processed, split and re-joined into the 50ft reel, was any vital evidence lost? US Secret Service investigators found that his clockwork camera ran at 18.3 fps, allowing them to create a timeline of events. They then compared the footage to Zapruder’s previous home movies to identify anomalies, such as first-frame flashes that would show when the camera had stopped and started. This forensic examination also highlighted the camera’s propensity to record visual information outside the frame defined by the camera’s gate; given the immensity of the event recorded, these artefacts outside the picture area received scrutiny perhaps never before applied to them, in tandem with the photographic evidence held within the frame. The Warren Commission report on the shooting claimed, “Of all the witnesses to the tragedy, the only unimpeachable one is the camera of Abraham Zapruder” (Bruzzi, 2006, p.20). However, Sean Cubitt observes:

Perhaps the most famous of all news footage, the Zapruder film of the Kennedy assassination, shown on North American television for the first time only in 1973, is exemplary. No longer news, it had already become talismanic. Slowed down, blown up, and most of all repeated, the Zapruder film stands head and shoulders above similar moments, like the footage of the Rodney King beating, not because the event was more spectacular or significant, but because its meanings remain so profoundly uncertain. The film is evidence, but despite its forensic standing, it is inconclusive. (Cubitt, 2004, p.210)

Zapruder’s camera could not record sound so there were no auditory gunshots, and the cameraman was facing the motorcade, rather than inadvertently filming the shooter/s to his rear.

Super 8 – reel memories

This page includes a discussion of the now-obsolete Kodak Kodachrome colour reversal film stock, which makes up a large part of my Super 8 collection, and of the way this emulsion renders colour, which has become a byword for a nostalgic representation of the past (Hill, 2021).

Super 8’s predecessor, Standard 8, required a level of skill and dexterity to load a roll of film successfully. 16mm cameras often required greater technical know-how to load and operate, and both the cameras and the film stock were larger and more expensive. Super 8 was designed to remove these complicated procedures that created hurdles for the novice filmmaker.

Early Super 8 cameras were typically battery-driven with basic controls; the user had only to insert a cartridge, turn on the camera, aim, press the trigger and repeat until the cartridge was used up. As the format became popular, manufacturers produced more sophisticated cameras which afforded greater control and quality for an enthusiast owner. The upshot of these innovations was that I could achieve pleasing results with little experience and no mentoring in using a film camera. Sarah Turner describes how the accessibility of the format and the spontaneity it engendered influenced her early filmmaking at art college:

I was working on 16mm, and we were taught its full properties, including emulsion, light meters, stocks, ASAs and so on, whereas Super 8 is primarily a domestic format with automatic exposure. Super 8 was largely regarded as a ‘sketchbook’ medium and that made it popular when I was at the Slade. (Turner, 2014, p.83)

She goes on to explain how using Super 8 offers an “alchemy of mistakes” giving rise to results that could be disastrous but could be magical because of “happy accidents” (Ibid., p.84).

My interest in still photography influenced how I engaged initially with celluloid filmmaking in the 1980s, with images taking precedence over audio. I had been involved with video production for several years, where sound was almost always recorded to videotape, along with pictures. As I continued to shoot Super 8, I bought a Canon 814, whose film chamber would only accept silent cartridges, which had the bonus of being less expensive than the sound-striped emulsions. Purchasing a ‘silent’ camera dictated that the collection grew without sync sound.

Super 8’s successors for home movies – camcorders recording to a variety of video formats – all captured sound alongside moving images, for playback on domestic television sets, either by connecting the camcorder with cables or by playing the tape in the videocassette recorders that were becoming commonplace in the home in the 1980s (Mulvey, 2005, pp.22, 101).

As discussed below, most Super 8 users were reluctant to edit their movies, but those that recorded audio faced an extra challenge. It is telling that Super 8 sound film cartridges were some of the first to be discontinued as home video technology, with its longer running times and much cheaper media, superseded film as a more convenient way to record family life, while silent Super 8 film stocks remain in production.

I had known no ‘analogue’ filmmakers, either professional or home moviemakers, and my only exposure to film apparatus was at 16mm educational film shows in school, where the schoolmasters – they were all male – who fiddled with the projector at the rear of the room never broached the subject of film reels and lacing, much less the processes that created the films. I had watched films in cinemas, but the mechanics – hidden away in the projection boxes – had remained a mystery. Also, the tiny 50ft filmstrip of Super 8 images seemed a world away from ‘proper’ 35mm cinema, whose feature films came in huge flight cases dropped off by couriers, a few floors below the video workshop where I worked in Plymouth Arts Centre.

I was unaware of the disdain a large section of the filmmaking world held for the newly accessible video technology, with its blurry low-resolution imagery that degraded with each copy. This copying was inherent in analogue tape-to-tape editing, so what little technical quality was present in the first-generation rushes diminished on the second-generation edit master and again on third-generation screening copies. Celluloid had associations with cinema and high-end television and music video production, but my efforts seemed remote from such considerations. Any desire to experience the films in their best ‘natural’ state through projection was absent beyond ‘seeing what was there’ when the film came back from the processor. The reels in my collection were viewed, then returned to storage with no thoughts of physically editing them or of public projection events. Now, years later, the question arises: if the 50ft film roll has remained uncut for decades, is it not by default a ‘finished film’?